Math Tutorials
Detailed mathematical notes, derivations, and tutorials to supplement lectures.
Getting Started
Tutorial 1: Mathematical Foundations and Terminology
Essential notation, terminology, and basic concepts you need to understand before starting the course.
What you'll learn:
- Mathematical notation and symbols (\(\in, \forall, \exists, \sum, \prod\))
- Number systems (\(\mathbb{N}, \mathbb{Z}, \mathbb{R}, \mathbb{C}\))
- Vectors and matrices basics
- Essential operations (dot product, matrix multiplication)
- Linear independence, span, and basis
- Norms and distance
- Practice problems with solutions
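As a small taste of the operations listed above, here is a minimal sketch of the dot product and Euclidean norm in NumPy (the vector values are illustrative, not from any tutorial):

```python
import numpy as np

# Two example vectors in R^3 (values chosen only for illustration)
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

# Dot product: sum of elementwise products, u . v = sum_i u_i v_i
dot = np.dot(u, v)          # 1*2 + 2*0 + 2*1 = 4.0

# Euclidean (l2) norm: sqrt(u . u)
norm_u = np.linalg.norm(u)  # sqrt(1 + 4 + 4) = 3.0

print(dot, norm_u)
```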
Lecture Tutorials
Tutorial 2: Linear Algebra
The computational foundation of machine learning — systems of equations, vector spaces, and linear mappings.
What you'll learn:
- Systems of linear equations and their geometric interpretation
- Matrices: operations, inverse, transpose
- Gaussian elimination and row echelon form
- Vector spaces, subspaces, and the 3-step subspace test
- Linear independence, span, basis, and dimension
- Rank, kernel, and image of linear mappings
- Rank-Nullity Theorem
- Change of basis and affine spaces
- 8 practice problems with solutions
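The Rank-Nullity Theorem from this list can be checked numerically. A minimal sketch, assuming a small made-up matrix whose third row is the sum of the first two:

```python
import numpy as np

# Illustrative 3x4 matrix: row 3 = row 1 + row 2, so the rows are dependent
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])

rank = np.linalg.matrix_rank(A)  # dimension of the image (column space)
n_cols = A.shape[1]

# Rank-Nullity Theorem: rank(A) + dim(ker(A)) = number of columns
nullity = n_cols - rank
print(rank, nullity)
```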
Tutorial 3: Analytic Geometry
The geometry behind machine learning — norms, inner products, projections, and the Gram-Schmidt process.
What you'll learn:
- Norms (\(\ell_1\), \(\ell_2\), \(\ell_\infty\)) and their properties
- Inner products and positive definite matrices
- Cauchy-Schwarz inequality and distances
- Angles, orthogonality, and orthonormal bases
- Orthogonal complement and its relation to the kernel
- Projections onto lines and general subspaces
- Gram-Schmidt orthogonalization process
- Rotation matrices and their properties
- Practice problems with solutions
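The Gram-Schmidt process above can be sketched in a few lines. This is a classical (not numerically robust) version, with made-up input vectors, meant only to show the projection-and-normalize loop:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize linearly independent vectors."""
    basis = []
    for v in vectors:
        # Subtract the projections of v onto the orthonormal vectors built so far
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# Illustrative input: two independent vectors in R^2
q1, q2 = gram_schmidt([np.array([3., 1.]), np.array([2., 2.])])
print(np.dot(q1, q2))  # ~0: the outputs are orthogonal
```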
Tutorial 4: Matrix Decomposition
Reveals the hidden structure in matrices — eigenvalues, SVD, and matrix approximation.
What you'll learn:
- Determinants and trace: computation and properties
- Eigenvalues and eigenvectors: characteristic polynomial, computation
- Cholesky decomposition for positive definite matrices
- Eigendecomposition and diagonalization
- Singular Value Decomposition (SVD): geometric interpretation
- Matrix approximation via truncated SVD
- Practice problems with solutions
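Matrix approximation via truncated SVD, listed above, can be demonstrated on a small made-up matrix. The sketch below keeps only the largest singular value and checks the Eckart-Young property that the spectral-norm error of the rank-1 approximation equals the dropped singular value:

```python
import numpy as np

# Illustrative matrix; SVD factors A = U diag(s) V^T
A = np.array([[3., 1., 1.],
              [-1., 3., 1.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-1 approximation: keep only the largest singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Eckart-Young: the spectral-norm error equals the first dropped singular value
err = np.linalg.norm(A - A1, ord=2)
print(err, s[1])
```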
Tutorial 5: Vector Calculus
The mathematics of learning — from derivatives to backpropagation.
What you'll learn:
- Differentiation rules and Taylor series
- Partial derivatives and gradients
- Jacobians for vector-valued functions
- Matrix calculus and useful gradient identities
- Chain rule for multivariate functions
- Backpropagation and computation graphs
- Hessian matrix and second-order methods
- Practice problems with solutions
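A standard companion to hand-derived gradients like those above is a finite-difference check. A minimal sketch, using the function \(f(x) = x^\top x\) (whose gradient is \(2x\)) as an illustrative example:

```python
import numpy as np

def f(x):
    # f(x) = x^T x, the squared l2 norm; its analytic gradient is 2x
    return np.dot(x, x)

def numerical_grad(f, x, h=1e-6):
    """Central finite differences: a sanity check for hand-derived gradients."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = np.array([1.0, -2.0, 0.5])
print(np.allclose(numerical_grad(f, x), 2 * x, atol=1e-5))
```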
Tutorial 6: Probability and Distributions
The language of uncertainty — from sample spaces to Gaussian distributions.
What you'll learn:
- Probability spaces and axioms (Kolmogorov)
- Conditional probability and Bayes' Theorem
- Discrete distributions (Bernoulli, Binomial, Geometric)
- Continuous distributions (Uniform, Exponential, Gaussian)
- Expected value, variance, and computation rules
- Joint and marginal distributions
- Covariance, correlation, and independence
- Common distributions reference table
- Practice problems with solutions
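Bayes' Theorem from this list can be worked through on the classic rare-disease example. All probabilities below are made-up numbers chosen for illustration, not values from the tutorial:

```python
# Illustrative setup: a diagnostic test for a rare condition
p_d = 0.01          # P(disease), the prior
p_pos_d = 0.95      # P(positive | disease), sensitivity
p_pos_nd = 0.05     # P(positive | no disease), false-positive rate

# Law of total probability: P(positive)
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes: P(disease | positive) = P(positive | disease) P(disease) / P(positive)
posterior = p_pos_d * p_d / p_pos
print(round(posterior, 3))
```

Despite the accurate test, the posterior is only about 16%, because the prior is so small; this is the kind of counterintuitive result the tutorial's Bayes section addresses.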
Tutorial Format
Each tutorial includes:
- Motivation - Why this topic matters for AI/ML
- Mathematical Theory - Rigorous treatment with definitions
- Worked Examples - Step-by-step solutions
- Practice Problems - With full solutions
- Key Takeaways - Summary of essential concepts
How to Use These Tutorials
Study Strategy:
- Read tutorial before corresponding lecture
- Work through examples by hand
- Attempt practice problems before checking solutions
- Use as reference during homework
- Review before exams
Tips:
- Don't skip the proofs — they build understanding
- Work examples yourself before checking solutions
- Connect abstract concepts to concrete ML applications
- Build your own formula sheet as you go
Tutorials curated by Mohammed Alnemari • Mathematics of AI • Spring 2026
Last Updated: February 8, 2026