Course Overview
Graduate-level study of numerical optimization methods and their theoretical foundations, with emphasis on convergence theory and the derivation of specific algorithms.
Optimization Fundamentals
- Optimality conditions
- Constraint qualification
- Convergence analysis
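As a small illustration of the first topic above, the sketch below numerically checks the first-order optimality condition grad f(x*) = 0 for an unconstrained least-squares problem (the example objective and data are illustrative, not taken from the course materials):

```python
import numpy as np

# Hypothetical example: verify that the gradient vanishes at the minimizer
# of f(x) = 0.5 * ||A x - b||^2. The minimizer solves the normal equations
# A^T A x = A^T b.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x_star = np.linalg.solve(A.T @ A, A.T @ b)   # closed-form minimizer
grad = A.T @ (A @ x_star - b)                # gradient of f at x_star

# At an unconstrained minimizer the gradient is zero up to rounding error.
print(np.linalg.norm(grad))
```

For constrained problems the analogous check involves the KKT conditions, which is where constraint qualification enters.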
First-Order Methods
- Gradient descent algorithms
- Proximal methods
- Coordinate descent
- Momentum methods
- Adaptive learning rates
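The first-order topics above can be illustrated with a minimal sketch of gradient descent with Polyak "heavy-ball" momentum on a convex quadratic (the matrix, step size, and momentum factor below are illustrative choices, not values prescribed by the course):

```python
import numpy as np

# f(x) = 0.5 * x^T Q x - c^T x, whose unique minimizer solves Q x = c.
Q = np.array([[3.0, 0.5], [0.5, 1.0]])   # symmetric positive definite
c = np.array([1.0, -2.0])

def grad(x):
    return Q @ x - c

x = np.zeros(2)
v = np.zeros(2)          # momentum buffer
step, beta = 0.2, 0.5    # illustrative step size and momentum factor

for _ in range(200):
    v = beta * v - step * grad(x)   # accumulate a decaying velocity
    x = x + v                       # move along the velocity

x_star = np.linalg.solve(Q, c)
print(np.linalg.norm(x - x_star))
```

Setting `beta = 0` recovers plain gradient descent; momentum accelerates convergence on ill-conditioned problems, which is part of what the convergence analysis above quantifies.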
Second-Order Methods
- Newton's method
- Hessian-based and quasi-Newton methods
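A one-dimensional sketch of Newton's method (illustrative objective, not the course's code): for f(x) = x - log(x), which is strictly convex on x > 0 with minimizer x = 1, the update x_{k+1} = x_k - f'(x_k)/f''(x_k) uses the exact second derivative and converges quadratically near the solution.

```python
def fprime(x):   # f'(x) = 1 - 1/x
    return 1.0 - 1.0 / x

def fsecond(x):  # f''(x) = 1/x^2 > 0, so f is strictly convex on x > 0
    return 1.0 / x**2

x = 0.5
for _ in range(6):
    x = x - fprime(x) / fsecond(x)   # Newton step with the exact Hessian

print(x)  # approaches the minimizer x = 1 rapidly
```

Quasi-Newton methods replace the exact second derivative with an approximation built from gradient differences, trading some of this quadratic convergence for cheaper iterations.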
Stochastic Methods
- Stochastic gradient descent
- Variance reduction techniques
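A minimal stochastic gradient descent sketch on synthetic least-squares data (the mini-batch size, 1/sqrt(t) step-size decay, and data model are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy observations

x = np.zeros(d)
batch = 32
for t in range(1, 2001):
    idx = rng.integers(0, n, size=batch)          # sample a mini-batch
    g = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient
    x -= 0.1 / np.sqrt(t) * g                     # decaying step size

print(np.linalg.norm(x - x_true))
```

Because each mini-batch gradient is a noisy estimate of the full gradient, plain SGD stalls at a noise floor set by the step size; variance reduction techniques such as SVRG shrink that gradient noise to recover faster convergence.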
Applications and Implementation
- Numerical linear algebra
- Machine learning optimization
- Scientific computing applications
- Large-scale optimization
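The numerical linear algebra and large-scale themes above meet in iterative solvers. The sketch below implements the unpreconditioned conjugate gradient method, the standard matrix-free solver for the symmetric positive definite systems (e.g. Newton systems) that arise at scale; the test matrix is a random SPD stand-in, not course data.

```python
import numpy as np

def conjugate_gradient(S, rhs, tol=1e-10, max_iter=None):
    """Solve S x = rhs for symmetric positive definite S using only S @ v products."""
    n = len(rhs)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = rhs.copy()         # residual r = rhs - S x (x starts at zero)
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Sp = S @ p
        alpha = rs / (p @ Sp)    # exact line search along p
        x += alpha * p
        r -= alpha * Sp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # new S-conjugate direction
        rs = rs_new
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((30, 30))
S = M @ M.T + 30 * np.eye(30)    # well-conditioned SPD test matrix
rhs = rng.standard_normal(30)
x = conjugate_gradient(S, rhs)
print(np.linalg.norm(S @ x - rhs))
```

Because the method touches `S` only through matrix-vector products, it scales to problems where forming or factoring the matrix is infeasible.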