Course Overview
An advanced exploration of the mathematical foundations of modern machine learning. The course combines theory, modeling, and applications to build deep intuition for fundamental matrix methods and optimization techniques.
Linear Transformations
- Abstract vector spaces
- Linear transformations
- Matrix representations
- Change of basis (sketched in code below)
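As a small taste of this unit, a minimal NumPy sketch of a change of basis; the map and the basis matrix are illustrative choices, not examples taken from the course.

```python
import numpy as np

# A linear map written in the standard basis (illustrative choice).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Columns of B are the new basis vectors (assumed invertible).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The same map expressed in the new basis: A' = B^{-1} A B.
A_new = np.linalg.solve(B, A @ B)

# Sanity check: applying the map gives the same vector in either coordinate system.
x_new = np.array([1.0, 2.0])   # coordinates in the new basis
x_std = B @ x_new              # the same vector in standard coordinates
assert np.allclose(B @ (A_new @ x_new), A @ x_std)
print(A_new)
```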
Eigentheory
- Eigenvalues and eigenvectors
- Diagonalization (sketched in code below)
- Jordan canonical form
- Minimal polynomials
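To make the diagonalization item concrete, a short NumPy sketch with a symmetric matrix, chosen as a deliberately easy case since symmetric matrices are always diagonalizable with real eigenvalues:

```python
import numpy as np

# Symmetric example matrix (illustrative choice).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigendecomposition: w holds the eigenvalues, the columns of V are eigenvectors.
w, V = np.linalg.eigh(A)

# Diagonalization A = V diag(w) V^{-1}; here V is orthogonal, so V^{-1} = V^T.
assert np.allclose(A, V @ np.diag(w) @ V.T)

# Each eigenpair satisfies A v = lambda v.
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)
print(w)
```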
Advanced Decompositions
- Singular Value Decomposition (SVD), sketched in code below
- QR decomposition
- Polar decomposition
- Matrix factorizations
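A brief sketch of the first two factorizations above using NumPy; the matrix is randomly generated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # placeholder matrix

# Singular Value Decomposition: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# QR decomposition: A = Q R, with orthonormal columns in Q and upper-triangular R.
Q, R = np.linalg.qr(A)
assert np.allclose(A, Q @ R)
assert np.allclose(Q.T @ Q, np.eye(3))

print(s)   # singular values in decreasing order
```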
Matrix Methods
- Linear least squares methods (sketched in code below)
- Eigenvalue decomposition
- Subspace methods and analysis
- Matrix factorization techniques
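A minimal least-squares sketch on synthetic data (the data, noise level, and problem size are made up for illustration); it also checks the solution against the SVD-based pseudoinverse to connect the decomposition and method viewpoints.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic overdetermined system: y ≈ X w_true + noise (illustrative data).
X = rng.standard_normal((100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(100)

# Least-squares solution minimizing ||X w - y||_2.
w_hat, residuals, rank, svals = np.linalg.lstsq(X, y, rcond=None)

# Same solution via the SVD pseudoinverse: w = V diag(1/s) U^T y.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
w_svd = Vt.T @ ((U.T @ y) / s)
assert np.allclose(w_hat, w_svd)
print(w_hat)
```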
Optimization Techniques
- Stochastic gradient descent (sketched in code below)
- Alternating Direction Method of Multipliers (ADMM)
- Iteratively reweighted least squares
- Convergence analysis
- Regularization methods
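As one example from this unit, a sketch of plain mini-batch stochastic gradient descent on a least-squares objective; the step size, batch size, and synthetic data are illustrative choices rather than course prescriptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: minimize (1/2m) ||X w - y||^2 over w (data made up for illustration).
n, d = 500, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.05 * rng.standard_normal(n)

w = np.zeros(d)
step = 0.01    # assumed fixed step size
batch = 32     # assumed mini-batch size

for epoch in range(50):
    perm = rng.permutation(n)
    for start in range(0, n, batch):
        idx = perm[start:start + batch]
        Xb, yb = X[idx], y[idx]
        grad = Xb.T @ (Xb @ w - yb) / len(idx)   # mini-batch gradient
        w -= step * grad

print(np.linalg.norm(w - w_true))   # should shrink toward the noise floor
```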
Applications
- Principal Components Analysis (PCA), sketched in code below
- Image compression and denoising
- Low-rank matrix completion
- Kernel ridge regression
- Spectral clustering
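A compact sketch of PCA via the SVD of the centered data matrix; the data are random placeholders, and keeping two components is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 10))   # placeholder data: 200 samples, 10 features

# Center the data; the right singular vectors of the centered matrix are the principal axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                          # number of components to keep (arbitrary here)
components = Vt[:k]            # principal directions, shape (k, 10)
scores = Xc @ components.T     # projected data, shape (200, k)
explained_var = s[:k] ** 2 / (len(X) - 1)

print(explained_var)
```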
Implementation Projects
- Matrix factorization implementations
- Optimization algorithm comparisons
- Real-world application case studies
- Performance analysis and benchmarking
Theoretical Foundations
- Linear algebra fundamentals
- Convex optimization theory
- Statistical learning principles
- Computational complexity analysis