Gaussian Processes for Big Data

James Hensman, Nicolò Fusi, Neil D. Lawrence
Uncertainty in Artificial Intelligence, volume 29, AUAI Press, 2013.

Abstract

We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.
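For readers who want to try the idea out, the following is a minimal sketch of the approach the abstract describes: a sparse variational GP whose evidence lower bound is optimised stochastically on minibatches, with a small set of inducing inputs summarising the full data set. It is written against GPflow's SVGP model and is not code from the paper itself; the exact API and argument names may differ between library versions.

# Minimal sketch (not from the paper): stochastic variational inference for a
# sparse GP regression model with inducing points. Assumes GPflow and
# TensorFlow are installed; API details may vary by version.
import numpy as np
import tensorflow as tf
import gpflow

# Toy 1-D regression data standing in for a large data set.
N = 10_000
X = np.random.rand(N, 1)
Y = np.sin(12 * X) + 0.1 * np.random.randn(N, 1)

# A small set of inducing inputs acts as the global variables of the model.
Z = np.linspace(0, 1, 50)[:, None]

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=Z,
    num_data=N,  # rescales the minibatch ELBO estimate to the full data set
)

# Stochastic variational inference: optimise the ELBO on random minibatches.
dataset = tf.data.Dataset.from_tensor_slices((X, Y)).repeat().shuffle(N).batch(256)
loss = model.training_loss_closure(iter(dataset))
optimizer = tf.optimizers.Adam(learning_rate=0.01)

for step in range(2000):
    optimizer.minimize(loss, model.trainable_variables)

# Predictive mean and variance at test locations.
mean, var = model.predict_y(np.linspace(0, 1, 100)[:, None])

Because the variational bound decomposes over data points, each optimisation step touches only a minibatch, which is what makes the method applicable to data sets with millions of observations.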

Cite this Paper


BibTeX
@InProceedings{Hensman-bigdata13,
  title     = {{G}aussian Processes for Big Data},
  author    = {Hensman, James and Fusi, Nicolò and Lawrence, Neil D.},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2013},
  editor    = {Nicholson, Ann and Smyth, Padhraic},
  volume    = {29},
  publisher = {AUAI Press},
  pdf       = {http://auai.org/uai2013/prints/papers/244.pdf},
  url       = {http://inverseprobability.com/publications/gaussian-processes-for-big-data.html},
  abstract  = {We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.}
}
Endnote
%0 Conference Paper
%T Gaussian Processes for Big Data
%A James Hensman
%A Nicolò Fusi
%A Neil D. Lawrence
%B Uncertainty in Artificial Intelligence
%D 2013
%E Ann Nicholson
%E Padhraic Smyth
%F Hensman-bigdata13
%I AUAI Press
%U http://inverseprobability.com/publications/gaussian-processes-for-big-data.html
%V 29
%X We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.
RIS
TY  - CPAPER
TI  - Gaussian Processes for Big Data
AU  - James Hensman
AU  - Nicolò Fusi
AU  - Neil D. Lawrence
BT  - Uncertainty in Artificial Intelligence
DA  - 2013/07/11
ED  - Ann Nicholson
ED  - Padhraic Smyth
ID  - Hensman-bigdata13
PB  - AUAI Press
VL  - 29
L1  - http://auai.org/uai2013/prints/papers/244.pdf
UR  - http://inverseprobability.com/publications/gaussian-processes-for-big-data.html
AB  - We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.
ER  -
APA
Hensman, J., Fusi, N., & Lawrence, N. D. (2013). Gaussian Processes for Big Data. Uncertainty in Artificial Intelligence, 29. Available from http://inverseprobability.com/publications/gaussian-processes-for-big-data.html.
