MATH-329 / 5 credits

Teacher: Boumal Nicolas

Language: English

## Summary

This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms to solve constrained and unconstrained problems.

## Required courses

Students are expected to be comfortable with linear algebra, analysis, and mathematical proofs. Lectures, homework, and the final exam are proof-heavy.

Students are expected to be (or become) comfortable writing code in Matlab. They may be allowed to write some of their work in Python or Julia upon request. Homework requires a substantial amount of coding. We will not teach Matlab, but many online resources are available to help you.

## Learning Outcomes

By the end of the course, the student must be able to:

• Recognize and formulate a mathematical optimization problem.
• Analyze and implement the gradient descent method, Newton's method, the trust-region method and the augmented Lagrangian method, among others.
• Establish and discuss local and global convergence guarantees for iterative algorithms.
• Exploit elementary notions of convexity and duality in optimization.
• Apply the general theory to particular cases.
• Prove some of the most important theorems studied in class.
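To give a flavor of the algorithms listed above, here is a minimal sketch of gradient descent with an Armijo backtracking line search, written in Python (one of the languages accepted for homework). The test problem, function names, and parameter values are purely illustrative and are not taken from the course material.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, c=1e-4, tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking line search (illustrative sketch)."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop at approximate first-order stationarity
            break
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        # f(x - alpha*g) <= f(x) - c * alpha * ||g||^2
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= 0.5
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose unique minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = gradient_descent(f, grad, np.zeros(2))
```

On this quadratic, the iterates converge to the solution of A x = b; the course studies precisely when and how fast such convergence guarantees hold.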

## Teaching methods

Lectures + exercise sessions + extensive homework (in groups)

## Expected student activities

Students are expected to attend lectures and to participate actively in class and in exercise sessions, which include both theoretical work and programming assignments. Homework, done in groups, likewise combines theoretical and numerical work; it requires a substantial amount of effort throughout the semester and accordingly accounts for a substantial part of the final grade.

## Assessment methods

Final exam (40%) + homework (60%)

The overall grade (computed as above) is rounded to the nearest quarter of a point: up if the individual exam grade is a passing grade, down otherwise.

## Supervision

• Office hours: No
• Assistants: Yes

## Bibliography

Book: "Numerical Optimization", J. Nocedal and S. Wright, Springer, 2006: https://link.springer.com/book/10.1007/978-0-387-40065-5

## Notes/Handbook

Lecture notes provided by the lecturer: https://www.nicolasboumal.net/papers/MATH329-Lecture_notes_Boumal_2023.htm

## In the programs

• Semester: Fall
• Exam form: Written (winter session)
• Subject examined: Continuous optimization
• Lecture: 2 Hour(s) per week x 14 weeks
• Exercises: 2 Hour(s) per week x 14 weeks

