# Bayesian Regression

Bayesian Inference Lecture Slides.

## Lab Class

To obtain the lab class as an IPython notebook, first launch the IPython notebook, then paste the following code into a notebook cell:

```python
# Download the lab notebook (Python 3; in Python 2 use urllib.urlretrieve directly)
import urllib.request
urllib.request.urlretrieve('https://github.com/SheffieldML/notebook/blob/master/lab_classes/machine_learning/MLAI_lab4.ipynb', 'MLAI_lab4.ipynb')
```

You should now be able to find the lab class by clicking `File->Open` on the ipython notebook menu.

There is a YouTube video available of me giving this material at the Gaussian Process Road Show in Uganda.

#### GPRS Uganda Video

The second half overlaps with the material from this week's lectures.

#### Video from 2011 on Gaussian Densities and Bayesian Inference

### Reading

- Rogers and Girolami, Chapter 3: Bayesian Methods, Sections 3.1–3.3 (pp. 95–117)
- Bishop, Section 1.2.3 (pp. 21–24)
- Bishop, Section 1.2.6, starting just past Equation 1.64 (pp. 30–32)
- Bishop, Section 2.3 up to the top of p. 85 (multivariate Gaussians)
- Bishop, Section 3.3 up to p. 159 (pp. 152–159) (Bayesian linear regression)
- Rogers and Girolami, Sections 3.7–3.8 (pp. 122–133)
- Bishop, Section 3.4 (pp. 161–165)

### Previous Lectures

Univariate Bayesian Inference

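As a reminder of the univariate case, the conjugate Gaussian update combines a Gaussian prior over an unknown mean with Gaussian observations; a minimal sketch, with made-up numbers for illustration:

```python
import numpy as np

m_0, s2_0 = 0.0, 1.0           # assumed prior mean and variance for the unknown mean
sigma2 = 0.5                   # assumed (known) observation noise variance
y = np.array([0.9, 1.1, 1.3])  # hypothetical observations
n = len(y)

# Gaussian prior x Gaussian likelihood gives a Gaussian posterior:
# precisions add, and the posterior mean is a precision-weighted average.
s2_n = 1.0 / (1.0 / s2_0 + n / sigma2)        # posterior variance
m_n = s2_n * (m_0 / s2_0 + y.sum() / sigma2)  # posterior mean

print(m_n, s2_n)  # posterior variance is always smaller than the prior variance
```

The posterior mean sits between the prior mean and the sample mean, pulled increasingly toward the data as n grows.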

Multivariate Bayesian Inference


Bayesian Polynomials on Olympics Data
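The Bayesian polynomial fit follows the same pattern with a multivariate Gaussian posterior over the weights; a sketch on synthetic data (the Olympics data set itself is not bundled here, and `alpha` and `sigma2` are assumed values):

```python
import numpy as np

# Synthetic stand-in for the Olympics example, generated from known coefficients.
np.random.seed(0)
x = np.linspace(-1, 1, 20)
y = 0.3 - 1.0 * x + 0.5 * x**2 + 0.05 * np.random.randn(20)

degree = 2
Phi = np.vander(x, degree + 1, increasing=True)  # polynomial basis [1, x, x^2]

alpha = 1.0       # assumed prior precision: w ~ N(0, alpha^{-1} I)
sigma2 = 0.05**2  # assumed noise variance

# Posterior over weights: N(m_N, C_N) with
# C_N = (alpha I + Phi^T Phi / sigma2)^{-1},  m_N = C_N Phi^T y / sigma2
C_N = np.linalg.inv(alpha * np.eye(degree + 1) + Phi.T @ Phi / sigma2)
m_N = C_N @ Phi.T @ y / sigma2

print(m_N)  # posterior mean weights, near the generating coefficients [0.3, -1, 0.5]
```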

## Learning Outcomes Week 5

- Understand the principle of integrating out parameters and how to use Bayes' rule to do so.
- Understand the role of the prior distribution.
- In univariate and multivariate Gaussian examples, be able to combine the prior with the likelihood to form a posterior distribution.
- Recognise the role of the marginal likelihood and know its form for Bayesian regression under Gaussian priors.
- Be able to compute the expected output of the model and its covariance using the posterior distribution and the formula for the function.
- Understand the effect of model averaging and its advantages when making predictions, including:
  - error bars
  - regularised prediction (reduced variance)
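The model-averaged prediction and its error bars follow directly from the weight posterior, rather than from a single point estimate of the weights. A sketch for Bayesian linear regression, with assumed values for the prior precision and noise variance:

```python
import numpy as np

# Hypothetical data from a straight line with Gaussian noise.
np.random.seed(1)
x = np.random.uniform(-1, 1, 15)
y = 0.5 + 2.0 * x + 0.1 * np.random.randn(15)

Phi = np.column_stack([np.ones_like(x), x])  # basis [1, x]
alpha, sigma2 = 1.0, 0.1**2                  # assumed prior precision and noise variance

# Weight posterior N(m_N, C_N), as in Bayesian linear regression.
C_N = np.linalg.inv(alpha * np.eye(2) + Phi.T @ Phi / sigma2)
m_N = C_N @ Phi.T @ y / sigma2

# Model averaging: integrate over w, so each prediction is itself Gaussian.
x_star = np.linspace(-1.5, 1.5, 7)
Phi_star = np.column_stack([np.ones_like(x_star), x_star])
mean_star = Phi_star @ m_N
# Predictive variance = noise + diag(Phi* C_N Phi*^T); grows away from the data.
var_star = sigma2 + np.sum(Phi_star @ C_N * Phi_star, axis=1)

for xs, m, v in zip(x_star, mean_star, var_star):
    print(f"x*={xs:+.2f}  mean={m:+.3f}  2*std={2*np.sqrt(v):.3f}")
```

The error bars widen outside the range of the observed inputs, which is the uncertainty information a point estimate of the weights cannot provide.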