EE-612 / 4 credits
Remark: next offered in Spring 2025 (given every two years).
This course provides PhD students with an in-depth understanding of the most fundamental algorithms in statistical pattern recognition and machine learning (including deep learning), together with concrete tools (Python source code) for their own research.
This course covers the broad families of regression, classification and probability distribution modelling methods, in particular: Linear regression, Logistic regression, k-NN, Decision Trees, Boosting, Dimensionality reduction (PCA, LDA, t-SNE), k-Means, GMMs, MLPs, CNNs and SVMs.
A - Introduction
- Data representation,
- Pattern Recognition and Machine Learning,
- Lab preparation (JupyterHub, Python and PyTorch).
B - Regression and Classification
- Linear Regression,
- Logistic Regression and Regularization, Overfitting and Capacity,
- k-NN, Decision Trees,
- Artificial Neural Networks: Multi-Layer Perceptron (MLP) and Back-Propagation,
- Deep Learning: Convolutional Neural Networks (CNN) and Optimization,
- Support Vector Machines.
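As a taste of the regression methods in part B, a minimal least-squares linear regression sketch (NumPy only; the function name and data are illustrative, not course material):

```python
import numpy as np

def fit_linear_regression(X, y):
    # Append a bias column and solve the least-squares problem.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w  # last entry is the intercept

# Noise-free data on the line y = 2x + 1: the fit recovers it exactly.
X = np.arange(5, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0
w = fit_linear_regression(X, y)
```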
C - Dimensionality reduction and Clustering
- Principal Component Analysis (PCA),
- Linear Discriminant Analysis (LDA),
- k-Means, Single Linkage.
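The PCA topic in part C can be sketched in a few lines: center the data and project onto the top eigenvectors of the covariance matrix (an illustrative sketch with made-up data, not the course lab):

```python
import numpy as np

def pca(X, k):
    # Center the data, then keep the top-k eigenvectors of the covariance.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]        # pick the k largest
    return Xc @ vecs[:, order], vals[order]   # projections, explained variances

# Points spread almost entirely along the direction (1, 1).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.1]])
Z, var = pca(X, 1)
```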
D - Probability distribution modelling
- Gaussian Mixture Models (GMM) and the Expectation-Maximization (EM) algorithm.
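The GMM/EM topic of part D can be illustrated with a two-component 1-D mixture: alternate between computing posterior responsibilities (E-step) and re-estimating the weights, means and variances (M-step). A minimal sketch with synthetic data (all names and values are illustrative):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    # EM for a two-component 1-D Gaussian mixture (illustrative only).
    mu = np.array([x.min(), x.max()])        # crude initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        d = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = pi * d
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means and standard deviations.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

# Two well-separated clusters around 0 and 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(10.0, 1.0, 200)])
pi, mu, sigma = em_gmm_1d(x)
```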
Keywords: Pattern Recognition, Machine Learning, Linear models, PCA, LDA, MLP, SVM, GMM, HMM.
Required prior knowledge: Linear algebra, Probability and Statistics, Signal Processing, Python (for the labs).
Assessment: laboratory work and oral exam.
In the programs
- Number of places: 24
- Exam form: Multiple (session free)
- Subject examined: Fundamentals in statistical pattern recognition
- Lecture: 36 Hour(s)
- Practical work: 20 Hour(s)