# MLPR class notes


## Background information

• w0a – Course administration, pdf.
• w0b – Books useful for MLPR, html, pdf.
• w0c – MLPR background self-test, html, pdf. Answers: html, pdf.
• w0d – Maths background for MLPR, html, pdf.
• w0e – Programming in Python, html, pdf.
• w0f – Expectations and sums of variables, html, pdf.
• w0g – Notation, html, pdf.

## Week 1: Introduction to ML with Linear Regression

• w1a – Course Introduction, html, pdf.
• w1b – Linear regression, html, pdf.
• w1c – Plotting and understanding basis functions, html, pdf.
• w1d – Linear regression, overfitting, and regularization, html, pdf.

## Week 2: ML fundamentals: generalization, error bars, Gaussians

• w2a – Training, Testing, and Evaluating Different Models, html, pdf.
• w2b – Univariate Gaussians, html, pdf.
• w2c – The Central Limit Theorem, html, pdf.
• w2d – Error bars, html, pdf.
• w2e – Multivariate Gaussians, html, pdf.

## Week 3: Classification and gradient-based fitting

• w3a – Classification: Regression, Gaussians, and pre-processing, html, pdf.
• w3b – Regression and Gradients, html, pdf.
• w3c – Logistic Regression, html, pdf.

## Week 4: Bayesian linear regression

• w4a – Bayesian regression, html, pdf.
• w4b – Bayesian inference and prediction, html, pdf.

## Week 5: Bayesian model choice and Gaussian processes

## Week 6: More detailed models: Gaussian process kernels, more non-Gaussian regression

• w6a – Gaussian Processes and Kernels, html, pdf.
• w6b – Overview: fitting probabilistic models, html, pdf.
• w6c – Softmax regression, html, pdf.
• w6d – A robust logistic regression model, html, pdf.

## Week 8: Neural Networks

• w8a – Neural networks introduction, html, pdf.
• w8b – Neural network architectures, html, pdf.
• w8c – Fitting and initializing neural networks, html, pdf.
• w8d – Backpropagation of Derivatives, html, pdf.

## Week 9: Autoencoders, PCA, Netflix Prize

• w9a – Autoencoders and Principal Components Analysis (PCA), html, pdf.
• w9b – Netflix Prize, html, pdf.

## Week 10: Bayesian logistic regression, Laplace approximation

• w10a – Bayesian logistic regression, html, pdf.
• w10b – The Laplace approximation applied to Bayesian logistic regression, html, pdf.

## Week 11: Sampling-based approximate Bayesian inference, variational inference

A coarse overview of the major topics covered is below. Some principles aren't taught in standalone units because they're useful in multiple contexts: gradient-based optimization, different regularization methods, ethics, and practical choices such as feature engineering and numerical implementation.

• Linear regression and ML introduction
• Evaluating and choosing methods from the zoo of possibilities
• Multivariate Gaussians
• Classification, generative and discriminative models
• Bayesian machine learning: linear regression, Gaussian processes and kernels
• Neural Networks
• Learning low-dimensional representations
• Approximate Inference: Bayesian logistic regression, Laplace, Variational
• Time allowing: applying what you know; other ways to combine and regularize models.

You are encouraged to write your own outlines and summaries of the course. Aim to make connections between topics, and imagine trying to explain the main concepts of the course to someone else.