MATH-520 / 5 credits

Teacher:

Language: English

Remark: Not given in 2024-25


Summary

Mathematical analysis of modern supervised machine learning techniques, from linear methods to artificial neural networks.

Content

  • Introduction (supervised learning, risk, error decomposition, over-fitting and capacity control + cross-validation, Bayes predictor for classification and regression)

  • Differentiable programming (backpropagation algorithm) and theoretical challenges posed by modern methods (large deep neural networks)

  • Statistical analysis of Empirical Risk Minimization (learning theory, from finite number of hypotheses to Rademacher / covering numbers)

  • First-order methods for optimization (gradient descent, stochastic gradient descent; a short gradient-descent sketch follows this list)

  • Kernel methods (positive-definite kernels and Reproducing Kernel Hilbert Spaces; a kernel ridge regression sketch follows this list)

  • Algorithmic regularization of gradient descent (reparameterized models, least-squares, mirror descent, logistic loss and max-margin)

  • Dynamics of wide neural networks (parameterizations, neural tangent kernel and feature-learning limits)
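
To give a concrete flavor of the first-order optimization topic above, here is a minimal sketch (in Python/NumPy, one of the languages suggested for the exercises) of plain gradient descent on a least-squares objective. It is an illustration only, not course material; the problem sizes, noise level, and iteration count are arbitrary assumptions.

    # Minimal sketch: gradient descent on the least-squares risk (1/2n)||Xw - y||^2.
    # Problem sizes, noise level, and iteration count are illustrative choices.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 10
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    L = np.linalg.norm(X, 2) ** 2 / n        # smoothness constant of the empirical risk
    w = np.zeros(d)
    for _ in range(500):
        grad = X.T @ (X @ w - y) / n         # gradient of (1/2n)||Xw - y||^2
        w -= grad / L                        # constant step size 1/L

    print("training MSE:", np.mean((X @ w - y) ** 2))
    print("distance to ground truth:", np.linalg.norm(w - w_true))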
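In the same spirit, a minimal sketch of kernel ridge regression with a Gaussian (positive-definite) kernel, illustrating the kernel methods topic; the bandwidth and regularization strength are arbitrary assumed values rather than recommendations from the course.

    # Minimal sketch: kernel ridge regression with a Gaussian kernel.
    # Bandwidth and regularization strength are arbitrary illustrative choices.
    import numpy as np

    def gaussian_kernel(A, B, bandwidth=1.0):
        # k(x, x') = exp(-||x - x'||^2 / (2 * bandwidth^2))
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-sq_dists / (2 * bandwidth ** 2))

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

    lam = 1e-2
    K = gaussian_kernel(X, X)
    alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)  # representer-theorem coefficients

    X_test = np.linspace(-3, 3, 50)[:, None]
    y_pred = gaussian_kernel(X_test, X) @ alpha                    # f(x) = sum_i alpha_i k(x, x_i)
    print("error vs. noiseless target:", np.mean((y_pred - np.sin(X_test[:, 0])) ** 2))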

 

Keywords

Supervised learning, Machine learning, Neural networks, Optimization, Statistics

Learning Prerequisites

Required courses

Analysis, Linear Algebra, Probability and Statistics

Important concepts to start the course

  • A good knowledge of undergraduate mathematics is important.

  • Ability to code in a scientific computing language of your choice (e.g. Python, Matlab, Julia). The course will involve coding exercises and assignments.

  • Having followed an introductory class on machine learning is beneficial.

 

Learning Outcomes

By the end of the course, the student must be able to:

  • Contextualise the research literature on theoretical machine learning
  • Interpret the practical behavior of complex machine learning pipelines through the lens of mathematical theory
  • Implement simple supervised learning algorithms from scratch
  • Reason about how statistical and optimization phenomena interact in machine learning practice
  • Distinguish between what is known and what is not known in the theory of deep learning

Assessment methods

Homework assignments, projects, presentation

Resources

Moodle Link

In the programs

  • Semester: Fall
  • Exam form: During the semester (winter session)
  • Subject examined: Topics in machine learning
  • Lecture: 2 Hour(s) per week x 14 weeks
  • Exercises: 2 Hour(s) per week x 14 weeks
  • Type: optional

Reference week
