Deep Gaussian Processes

at UCL-Duke University Workshop on Sensing and Analysis of High-Dimensional Data on Sep 4, 2014 [pdf]
Neil D. Lawrence, University of Sheffield

Videolectures

Abstract

In this talk we describe how deep neural networks can be modified to produce deep Gaussian process models. The framework of deep Gaussian processes allows for unsupervised learning, transfer learning, semi-supervised learning, multi-task learning and principled handling of different data types (count data, binary data, heavy-tailed noise distributions). The main challenge is to solve these models efficiently for massive data sets. That challenge is within reach through a new class of variational approximations known as variational compression. The underlying variational bounds are very similar to the objective functions for deep neural networks, giving the promise of efficient approaches to deep learning that are constructed from components with very well understood analytical properties.
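The core idea of composing Gaussian processes, where the output of one GP layer becomes the input to the next, can be illustrated with a minimal sketch. The snippet below is not the method presented in the talk (which relies on variational compression for scalability); it simply draws a sample from a two-layer deep GP prior using an assumed RBF kernel, with all function names and parameter choices being illustrative.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Exponentiated quadratic (RBF) covariance between two input sets."""
    sqdist = (np.sum(X1**2, 1)[:, None]
              + np.sum(X2**2, 1)[None, :]
              - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def sample_gp_layer(X, lengthscale=1.0, variance=1.0, jitter=1e-8):
    """Draw one sample function from a zero-mean GP prior, evaluated at X."""
    K = rbf_kernel(X, X, lengthscale, variance) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    return L @ np.random.randn(len(X), 1)

np.random.seed(0)
X = np.linspace(-3.0, 3.0, 200)[:, None]

# Two-layer deep GP prior sample: f(x) = f2(f1(x)).
# The hidden layer output of the first GP is fed as input to the second.
h = sample_gp_layer(X, lengthscale=1.0)   # hidden layer, f1(X)
y = sample_gp_layer(h, lengthscale=0.5)   # output layer, f2(f1(X))
```

Exact inference in such a composition is intractable in general, which is why the talk turns to variational compression to obtain tractable bounds resembling deep network objectives.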

Links