MATH-329 / 5 credits
Teacher: Boumal Nicolas
This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms to solve constrained and unconstrained problems.
Unconstrained optimization of differentiable functions
- Necessary optimality conditions
- The role of Lipschitz assumptions
- Gradient descent and Newton's method
- The trust-region method, conjugate gradients (CG), truncated CG
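As a flavor of the algorithms above, here is a minimal sketch (not course material) of gradient descent with a fixed step size 1/L, where L is a Lipschitz constant of the gradient; the quadratic test function and all names are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, x0, L, num_iters=100):
    """Minimize a function with L-Lipschitz gradient via x_{k+1} = x_k - (1/L) grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - (1.0 / L) * grad(x)
    return x

# Illustrative example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
L_const = np.linalg.eigvalsh(A).max()    # Lipschitz constant of the gradient
x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2), L_const, num_iters=500)
# x_star approaches the solution of A x = b.
```

The step size 1/L is exactly the choice whose convergence guarantees the course analyzes under Lipschitz assumptions on the gradient.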
Constrained optimization of differentiable functions
- Necessary optimality conditions, cones
- Projected gradient descent
- Notions of duality
- The quadratic penalty method
- The augmented Lagrangian method
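For the constrained setting, a minimal sketch (again illustrative, with an assumed box constraint and Euclidean projection, not course material) of projected gradient descent:

```python
import numpy as np

def projected_gradient_descent(grad, proj, x0, step, num_iters=200):
    """Iterate x_{k+1} = proj(x_k - step * grad(x_k)) over a closed convex set."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = proj(x - step * grad(x))
    return x

# Illustrative example: minimize 0.5 * ||x - c||^2 over the box [0, 1]^2,
# where c lies outside the box; the minimizer is the projection of c onto the box.
c = np.array([2.0, -0.5])
proj_box = lambda x: np.clip(x, 0.0, 1.0)  # Euclidean projection onto [0, 1]^2
x_star = projected_gradient_descent(lambda x: x - c, proj_box,
                                    np.zeros(2), step=1.0, num_iters=50)
```

The projection step is what enforces feasibility at every iterate, in contrast to penalty-type methods, which allow infeasible iterates and drive constraint violation to zero.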
Related topics and extensions may be included in lectures or through exercises / homework.
Note: precise contents may change during the semester, and from year to year.
Students are expected to be comfortable with linear algebra, analysis and mathematical proofs. Lectures, homework and the final exam are proof heavy.
Students are expected to be (or become) comfortable writing code in Matlab. They may be allowed to write some of their work in Python or Julia upon request. Homework requires a substantial amount of coding. We will not teach Matlab, but many online resources are available to help you.
By the end of the course, the student must be able to:
- Recognize and formulate a mathematical optimization problem.
- Analyze and implement the gradient descent method, Newton's method, the trust-region method and the augmented Lagrangian method, among others.
- Establish and discuss local and global convergence guarantees for iterative algorithms.
- Exploit elementary notions of convexity and duality in optimization.
- Apply the general theory to particular cases.
- Prove some of the most important theorems studied in class.
Lectures + exercise sessions + extensive homework (in groups)
Expected student activities
Students are expected to attend lectures and to participate actively in class and in exercise sessions, which include both theoretical work and programming assignments. Homework assignments likewise combine theoretical and numerical work; they require a substantial amount of work throughout the semester, are done in groups, and accordingly account for a substantial part of the final grade.
Final exam (40%) + homework (60%)
The overall grade (computed as above) is rounded up or down to the closest quarter of a point: up if the (individual) exam grade is a passing grade, down if not.
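One possible reading of the rounding rule, as a hypothetical sketch (the function name, the 6-point scale, and 4.0 as the passing threshold are assumptions, not stated in this description):

```python
import math

def final_grade(exam, homework, passing=4.0):
    """Hypothetical sketch: 40% exam + 60% homework, then rounded to a
    quarter point, up if the exam grade is passing, down otherwise.
    Assumes a 6-point scale with 4.0 as the passing grade."""
    raw = 0.4 * exam + 0.6 * homework   # weighted overall grade
    quarters = raw * 4                  # express in quarter points
    rounded = math.ceil(quarters) if exam >= passing else math.floor(quarters)
    return rounded / 4

# Example: exam 4.5, homework 5.0 -> raw 4.8, rounded up to 5.0;
# exam 3.5, homework 5.0 -> raw 4.4, rounded down to 4.25.
```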
Book "Numerical Optimization", J. Nocedal and S. Wright, Springer 2006: https://link.springer.com/book/10.1007/978-0-387-40065-5
Library resources
Lecture notes provided by the lecturer: https://www.nicolasboumal.net/papers/MATH329-Lecture_notes_Boumal_2023.htm
In the programs
- Semester: Fall
- Exam form: Written (winter session)
- Subject examined: Continuous optimization
- Lecture: 2 Hour(s) per week x 14 weeks
- Exercises: 2 Hour(s) per week x 14 weeks