You can find the intro talk and notes here.
There is also a video of me presenting similar material at the Machine Learning Summer School here.
If you want to review some of the background, the following might also be useful.
- Linear Algebra & Regression from my old Sheffield Course (with Jupyter notebook and video).
- Bayesian Regression from my old Sheffield course (also with Jupyter notebook and video), which goes through some other important fundamentals like Bayes' rule and multivariate Gaussians.
Here are the papers for the different sessions.
30th January 2020 at 16:00 in SW01, WGB
Variational Learning of Inducing Variables in Sparse Gaussian Processes AISTATS 2009 by Michalis Titsias
For some background, you can also check the first part of these notes and video from the machine learning summer school in Stellenbosch.
6th February 2020 at 16:00 in SW01, WGB
Gaussian Processes for Big Data UAI 2013 by James Hensman, Nicolò Fusi and Neil D. Lawrence.
13th February 2020 at 16:00 in SW01, WGB
Deep Gaussian Processes AISTATS 2013 by Andreas Damianou and Neil D. Lawrence.
For more background, check the second part of these notes and video from the machine learning summer school in Stellenbosch.
Linear Latent Force Models using Gaussian Processes by Mauricio Alvarez, David Luengo and Neil D. Lawrence in TPAMI.
For the mini-project, I recommend choosing one of the following four tasks.
Find an implementation of sparse variational Gaussian processes. Explore its properties for different input dimensionalities, using simulated and/or real data. You might want to make the exploration two-stage: one stage where you keep the covariance function parameters fixed, and one where you vary them.
Find an implementation of SVI for GPs. For a smaller dataset (where the standard sparse GP can also be used), explore the stochastic optimization routine. Try different patterns of update between the inducing variables and the covariance parameters. If possible, attempt to update the inducing point locations. Monitor the quality of the results by comparing against the best approximate GP as computed by the Titsias variational approach.
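The core of the SVI update can be sketched compactly. The natural-gradient step on q(u) = N(m, S) below follows the scheme of Hensman et al. (2013), but the function names, jitter, and fixed hyperparameters are my own illustrative assumptions; a handy property for the monitoring suggested above is that a single full-batch step with rho = 1 recovers the optimal q(u) from the Titsias bound:

```python
import numpy as np

def rbf(X1, X2, ls=1.0, var=1.0):
    """Exponentiated quadratic (RBF) covariance."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def svi_step(theta1, theta2, X_batch, y_batch, Z, N, noise, rho):
    """One stochastic natural-gradient step on q(u) = N(m, S), held in
    natural parameters theta1 = S^{-1} m, theta2 = -0.5 * S^{-1}."""
    B = len(y_batch)
    Kmm_inv = np.linalg.inv(rbf(Z, Z) + 1e-8 * np.eye(len(Z)))
    Kmb = rbf(Z, X_batch)
    scale = N / B                        # unbiased minibatch rescaling
    Lam = Kmm_inv @ Kmb @ Kmb.T @ Kmm_inv / noise
    t1_hat = scale * Kmm_inv @ Kmb @ y_batch / noise
    t2_hat = -0.5 * (Kmm_inv + scale * Lam)
    theta1 = (1 - rho) * theta1 + rho * t1_hat
    theta2 = (1 - rho) * theta2 + rho * t2_hat
    return theta1, theta2

def to_mean_cov(theta1, theta2):
    """Recover (m, S) from the natural parameterisation."""
    S = np.linalg.inv(-2.0 * theta2)
    return S @ theta1, S
```

In a real run you would loop over minibatches with a decaying rho, interleaving these steps with gradient updates of the covariance parameters (and, optionally, of Z) — exactly the update patterns the task asks you to vary.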
Use an implementation of Deep GPs to test different initialisation approaches. The implementation from my MLSS talk uses PCA for initialising the latent spaces. What other approaches can you try?
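For reference, PCA initialisation of a latent layer amounts to a few lines of NumPy; the function names here are mine and the MLSS implementation will differ in detail, but this is the basic idea, with a random initialisation included as the simplest alternative to compare against:

```python
import numpy as np

def pca_init(Y, latent_dim):
    """Initialise a latent layer by projecting the (centred) data onto its
    top principal components, rescaled to unit variance per dimension."""
    Yc = Y - Y.mean(axis=0)
    # SVD of the centred data: principal directions are the rows of Vt
    U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
    X = Yc @ Vt[:latent_dim].T           # projected latent coordinates
    return X / X.std(axis=0)

def random_init(Y, latent_dim, seed=0):
    """A simple alternative: random draws from a standard Gaussian,
    e.g. to compare the local optima each initialisation leads to."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((len(Y), latent_dim))
```

Other candidates worth trying in the same spirit include nonlinear embeddings (e.g. Isomap or t-SNE coordinates) as the starting latent space.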
Reimplement the latent force model covariance function in Python. Be careful about the numerical problems that arise in that reimplementation. Use your implementation to fit the latent force model to, e.g., a small motion capture dataset.
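The numerical trouble in this task lives in the closed-form (erf-based) expressions from the paper, where large exponents multiply small error-function values. A useful way to validate a closed-form reimplementation is against a brute-force quadrature version. The sketch below is my own construction for the simplest case, a first-order ODE dx/dt + D x = S u(t) with x(0) = 0 driven by an RBF latent force; it is slow, but it serves as a ground truth:

```python
import numpy as np

def rbf(s, sp, ell=1.0):
    """RBF covariance of the latent force u on two 1-D grids."""
    return np.exp(-0.5 * (s[:, None] - sp[None, :]) ** 2 / ell**2)

def trap_weights(x):
    """Composite trapezoidal quadrature weights for a uniform grid."""
    h = x[1] - x[0]
    w = np.full(len(x), h)
    w[[0, -1]] = 0.5 * h
    return w

def lfm_cov(t, tp, D=1.0, S=1.0, ell=1.0, n_quad=200):
    """Covariance of x(t) for dx/dt + D x = S u(t), u ~ GP(0, rbf), x(0)=0:
        k(t,t') = S^2 * int_0^t int_0^t' e^{-D(t-s)} e^{-D(t'-s')} k_u(s,s') ds ds'
    evaluated by trapezoidal quadrature rather than the closed form."""
    s = np.linspace(0.0, t, n_quad)
    sp = np.linspace(0.0, tp, n_quad)
    integrand = (np.exp(-D * (t - s))[:, None]
                 * np.exp(-D * (tp - sp))[None, :]
                 * rbf(s, sp, ell))
    return S**2 * trap_weights(s) @ integrand @ trap_weights(sp)

def lfm_gram(times, **kw):
    """Gram matrix of the LFM covariance on a set of times."""
    return np.array([[lfm_cov(t, tp, **kw) for tp in times] for t in times])
```

A closed-form reimplementation should match this to quadrature accuracy on small grids; checking symmetry and (near-)positive-definiteness of the resulting Gram matrix is a good first diagnostic before moving to the motion capture data.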