When learning advanced material, you won't immediately understand everything
just from reading the notes. *Please* sign up to the
forum, ask questions, and share insights and external materials that you
have discovered.

Lecture recordings and scans are on the Activities page.

Keyboard shortcut: step through the notes using the left and right arrow keys.

**Please annotate the HTML versions of the notes** in the forum, to keep
the class's comments together. You can show or hide links to PDF versions for printing;
however, the PDFs don't include the in-note questions.

Background information

- w0a – MLPR welcome and advice, html, pdf.
- w0b – Course administration and assessment, html, pdf.
- w0c – Books useful for MLPR, html, pdf.
- w0d – MLPR background self-test, html, pdf. Answers: html, pdf.
- w0e – Maths background for MLPR, html, pdf.
- w0f – Programming in Python, html, pdf.
- w0g – Expectations and sums of variables, html, pdf.
- w0h – Notation, html, pdf.

Week 1: Introduction to ML with Linear Regression

- w1a – Course Introduction, html, pdf.
- w1b – Linear regression, html, pdf.
- w1c – Plotting and understanding basis functions, html, pdf.
- w1d – Linear regression, overfitting, and regularization, html, pdf.

Week 2: ML fundamentals: generalization, error bars, Gaussians

- w2a – Training, Testing, and Evaluating Different Models, html, pdf.
- w2b – Univariate Gaussians, html, pdf.
- w2c – The Central Limit Theorem, html, pdf.
- w2d – Error bars, html, pdf.
- w2e – Multivariate Gaussians, html, pdf.

Week 3: Classification and gradient-based fitting

- w3a – Classification: Regression, Gaussians, and pre-processing, html, pdf.
- w3b – Regression and Gradients, html, pdf.
- w3c – Logistic Regression, html, pdf.

Week 4: Bayesian linear regression

Week 5: Bayesian model choice and Gaussian processes

- w5a – Bayesian model choice, html, pdf.
- A Bayesian linear regression demo
- w5b – Gaussian processes, html, pdf.
- A minimal GP demo
- Alternative GP demo

**Status Page:** Check which in-note questions you might have missed.

If you want a better idea of what we'll cover, and of the style of the
notes, last year's notes are available; we'll cover mostly the same material.
**Please don't ask questions on last year's notes; wait until they appear here.**

A coarse overview of the major topics is below. Some principles aren't taught in standalone units because they're useful in multiple contexts, such as gradient-based optimization, regularization methods, ethics, and practical choices like feature engineering and numerical implementation.

- Linear regression and ML introduction
- Evaluating and choosing methods from the zoo of possibilities
- Multivariate Gaussians
- Classification, generative and discriminative models
- Bayesian machine learning: linear regression, Gaussian processes and kernels
- Neural Networks
- Learning low-dimensional representations
- Approximate Inference: Bayesian logistic regression, Laplace, Variational

You are encouraged to write your own outlines and summaries of the course. Aim to make connections between topics, and imagine trying to explain to someone else what the main concepts of the course are.