Regression

The lecture slides for this week are available here.

YouTube Video

There is a YouTube video of me presenting this material at the Gaussian Process Road Show in Uganda.

You will need to watch this in HD to make the maths clearer.

Lab Class

Linear regression with numpy and Python.

The notebook for the lab class can be downloaded from here.

To obtain the lab class as an IPython notebook, first open the IPython notebook, then paste the following code into a cell and run it:

import urllib  # in Python 3 use urllib.request.urlretrieve instead
# Appending ?raw=true fetches the notebook file itself rather than the GitHub HTML page.
urllib.urlretrieve('https://github.com/SheffieldML/notebook/blob/master/lab_classes/machine_learning/MLAI_lab2.ipynb?raw=true', 'MLAI_lab2.ipynb')

You should now be able to find the lab class by clicking File->Open in the IPython notebook menu.

Reading

  • Reading (Regression)
    • Sections 1.1-1.3 of Rogers and Girolami.
    • Section 1.2.5 of Bishop up to Eq 1.65.
    • Section 1.1 of Bishop.
  • Reading (Matrix and Vector Review)
    • Section 1.3 of Rogers and Girolami.
  • Reading (Basis Functions)
    • Chapter 1, pg 1-6 of Bishop.
    • Section 1.4 of Rogers and Girolami.
    • Chapter 3, Section 3.1 of Bishop up to pg 143.

Learning Outcomes Week 2

Consolidate understanding of the stages of a basic probabilistic machine learning analysis (a short numpy sketch of these stages follows the list):

  • Write down the model.
  • Make an assumption about the errors.
  • Use the combination of the mathematical model and the error assumptions to write down a likelihood.
  • Maximize the likelihood with respect to the parameters of the model.
  • Use the resulting model to make predictions.
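
As a minimal sketch of these stages in numpy (the data, variable names and noise level here are made up for illustration and are not the lab data):

import numpy as np

# 1. Write down the model: a straight line y = m*x + c.
# 2. Assume the errors are independent Gaussian with variance sigma2.
x = np.random.rand(30, 1)                          # illustrative inputs
y = 2.5*x + 1.0 + 0.1*np.random.randn(30, 1)       # illustrative targets

# 3. The model plus the Gaussian error assumption gives the log likelihood.
def log_likelihood(m, c, sigma2):
    return (-0.5*len(x)*np.log(2*np.pi*sigma2)
            - 0.5*np.sum((y - m*x - c)**2)/sigma2)

# 4. Maximizing the likelihood in m and c is the same as minimizing the
#    sum of squared errors, here done with a least squares solver.
X = np.hstack([x, np.ones_like(x)])
w = np.linalg.lstsq(X, y)[0]
m, c = w[0, 0], w[1, 0]
sigma2 = np.sum((y - m*x - c)**2)/len(x)           # maximum likelihood variance

# 5. Use the fitted model to make predictions at new inputs.
x_test = np.linspace(0, 1, 5)[:, None]
y_pred = m*x_test + c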

Understand the principles of using gradient methods to find fixed point equations for maximizing a likelihood.
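
As a sketch of what such fixed point equations look like for the straight line example above (continuing the same notation; an illustration rather than a transcription of the lecture derivation), differentiating the sum of squared errors with respect to one parameter at a time and setting the result to zero gives an update for that parameter in terms of the other:

# Setting d/dc of sum_i (y_i - m*x_i - c)^2 to zero gives c = mean(y - m*x);
# setting d/dm to zero gives m = sum(x*(y - c)) / sum(x**2).
def update_c(m):
    return np.mean(y - m*x)

def update_m(c):
    return np.sum(x*(y - c)) / np.sum(x**2)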

Understand the weakness of coordinate descent methods when parameters are correlated.
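
Continuing the sketch, coordinate descent simply alternates the two fixed point updates; when the inputs are not centred, m and c are strongly coupled and the alternation can take many iterations to settle:

# Coordinate descent: alternate the fixed point updates above.
m_est, c_est = 0.0, 0.0
for iteration in range(1000):
    c_est = update_c(m_est)
    m_est = update_m(c_est)
# If the mean of x is far from zero the estimates of m and c are highly
# correlated, and these one-at-a-time updates zig-zag and converge slowly;
# centring x, or solving for both parameters jointly, avoids the problem.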

Understand the advantages of using multivariate calculus to maximize the likelihood in linear regression.
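
As a sketch of the joint solution (again continuing the notation above), taking the gradient with respect to the whole parameter vector at once and setting it to zero yields a linear system that is solved in a single step:

# Stack a column of ones alongside x to form the design matrix, then solve
# the normal equations (X^T X) w = X^T y for both parameters at once.
X = np.hstack([np.ones_like(x), x])
w = np.linalg.solve(np.dot(X.T, X), np.dot(X.T, y))
c_est, m_est = w[0, 0], w[1, 0]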

Understand how basis functions can be used to go from linear models to non-linear models.
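
A minimal sketch of this idea, continuing the example above and using a polynomial basis as an illustrative choice (the particular basis and degree are not prescribed by the reading):

# Replace the raw input x with a vector of basis functions phi(x); the model
# remains linear in the parameters, so the same solution applies.
def polynomial_basis(x, degree=3):
    return np.hstack([x**k for k in range(degree + 1)])

Phi = polynomial_basis(x)
w = np.linalg.solve(np.dot(Phi.T, Phi), np.dot(Phi.T, y))
y_pred = np.dot(polynomial_basis(x_test), w)       # non-linear in x, linear in w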