Bayesian Gaussian Process Latent Variable Model

Michalis K. Titsias, University of Athens
Neil D. Lawrence, University of Sheffield

In Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics 9, pp. 844–851

Abstract

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
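In this setting the latent inputs $X$ enter the Gaussian process covariance nonlinearly, so the marginal likelihood $p(Y) = \int p(Y|X)\,p(X)\,dX$ has no closed form. The training objective the abstract describes is a variational lower bound; in standard notation (ours, not quoted from the paper) it has the shape

$$\mathcal{F}(q) \;=\; \mathbb{E}_{q(X)}\big[\log p(Y|X)\big] \;-\; \mathrm{KL}\big(q(X)\,\|\,p(X)\big) \;\le\; \log p(Y),$$

where $q(X)$ is a Gaussian variational distribution over the latent inputs and the remaining expectation is made tractable with auxiliary inducing variables. Maximising $\mathcal{F}$ trains the model, and an ARD covariance lets the bound switch off unneeded latent dimensions, which is what gives the automatic dimensionality selection.

The linked software is the authors' vargplvm package (MATLAB); as a quick illustration, the model is also implemented in the GPy Python library. The sketch below assumes GPy's `BayesianGPLVM` class and random stand-in data, so treat it as illustrative rather than the authors' own code:

```python
# Minimal Bayesian GP-LVM sketch using GPy (assumed installed: pip install GPy).
# The data are random stand-ins; class and argument names follow GPy's API.
import numpy as np
import GPy

Y = np.random.randn(100, 12)        # 100 observations, 12 observed dimensions
Q = 5                               # deliberately generous latent dimensionality

kernel = GPy.kern.RBF(Q, ARD=True)  # one lengthscale per latent dimension (ARD)
model = GPy.models.BayesianGPLVM(Y, Q, kernel=kernel, num_inducing=20)
model.optimize(messages=False)      # maximise the variational lower bound

# Inverse lengthscales act as relevance weights: dimensions driven to (near)
# zero have been pruned automatically, as described in the abstract.
print(1.0 / kernel.lengthscale)
```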


@InProceedings{titsias-bayesgplvm10,
  title = 	 {Bayesian Gaussian Process Latent Variable Model},
  author = 	 {Michalis K. Titsias and Neil D. Lawrence},
  booktitle = 	 {Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics},
  pages = 	 {844--851},
  year = 	 {2010},
  editor = 	 {Yee Whye Teh and D. Michael Titterington},
  volume = 	 {9},
  address = 	 {Chia Laguna Resort, Sardinia, Italy},
  month = 	 {May},
  publisher = 	 {JMLR W\&CP 9},
  url =  	 {http://inverseprobability.com/publications/titsias-bayesgplvm10.html},
  abstract = 	 {We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.},
  crossref =  {Teh:aistats10},
  key = 	 {Titsias:bayesGPLVM10},
  linkpdf = 	 {http://jmlr.csail.mit.edu/proceedings/papers/v9/titsias10a/titsias10a.pdf},
  linksoftware = {https://github.com/SheffieldML/vargplvm},
}
%T Bayesian Gaussian Process Latent Variable Model
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
%C Chia Laguna Resort, Sardinia, Italy
%D 2010
%E Yee Whye Teh
%E D. Michael Titterington
%F titsias-bayesgplvm10
%I JMLR W&CP 9
%P 844--851
%U http://inverseprobability.com/publications/titsias-bayesgplvm10.html
%V 9
%X We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
TY  - CPAPER
TI  - Bayesian Gaussian Process Latent Variable Model
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
BT  - Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
PY  - 2010/01/01
DA  - 2010/01/01
ED  - Yee Whye Teh
ED  - D. Michael Titterington	
ID  - titsias-bayesgplvm10
PB  - JMLR W&CP 9
SP  - 844
EP  - 851
L1  - http://jmlr.csail.mit.edu/proceedings/papers/v9/titsias10a/titsias10a.pdf
UR  - http://inverseprobability.com/publications/titsias-bayesgplvm10.html
AB  - We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
ER  -

Titsias, M.K. & Lawrence, N.D. (2010). Bayesian Gaussian Process Latent Variable Model. Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics 9:844–851.