Learning for Larger Datasets with the Gaussian Process Latent Variable Model

Neil D. Lawrence, University of Sheffield

in Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics, pp 243-250

Abstract

In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
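The sparse GP schemes the abstract refers to share one common ingredient: replacing the full n x n kernel matrix with a low-rank approximation built from a small set of inducing inputs. As a rough, hedged illustration of that idea (not the paper's actual algorithms), the sketch below forms the Nystrom-style approximation K_ff ~= K_fu K_uu^{-1} K_uf with NumPy; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    # Squared-exponential (RBF) kernel between two sets of points.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-0.5 * sq / lengthscale**2)

def nystrom_approx(X, Z, lengthscale=1.0, jitter=1e-6):
    # Low-rank approximation K_ff ~= K_fu K_uu^{-1} K_uf using m inducing
    # inputs Z: the shared building block of sparse GP approximations.
    Kfu = rbf_kernel(X, Z, lengthscale)
    Kuu = rbf_kernel(Z, Z, lengthscale) + jitter * np.eye(len(Z))
    return Kfu @ np.linalg.solve(Kuu, Kfu.T)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # n = 200 data points
Z = X[rng.choice(200, 20, replace=False)]   # m = 20 inducing points
K_approx = nystrom_approx(X, Z)
K_full = rbf_kernel(X, X)
err = np.max(np.abs(K_full - K_approx))     # approximation error
```

Working with the m x m matrix Kuu instead of the full n x n kernel is what reduces the cost from O(n^3) to O(n m^2), which is the motivation for applying these techniques to the GP-LVM on larger datasets.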


@InProceedings{lawrence-larger07,
  title = 	 {Learning for Larger Datasets with the Gaussian Process Latent Variable Model},
  author = 	 {Neil D. Lawrence},
  booktitle = 	 {Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics},
  pages = 	 {243--250},
  year = 	 {2007},
  editor = 	 {Marina Meila and Xiaotong Shen},
  address = 	 {San Juan, Puerto Rico},
  publisher = 	 {Omnipress},
  edit = 	 {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2007-01-01-lawrence-larger07.md},
  url =  	 {http://inverseprobability.com/publications/lawrence-larger07.html},
  abstract = 	 {In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.},
  crossref =  {Meila:aistats07},
  key = 	 {Lawrence:larger07},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/gplvmLarger.pdf},
  linksoftware = {https://github.com/SheffieldML/GPmat/},
  group = 	 {shefml,gp,spgp,gplvm,dimensional reduction}
}
%T Learning for Larger Datasets with the Gaussian Process Latent Variable Model
%A Neil D. Lawrence
%C Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics
%D 2007
%E Marina Meila and Xiaotong Shen
%F lawrence-larger07
%I Omnipress	
%P 243--250
%U http://inverseprobability.com/publications/lawrence-larger07.html
%X In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
TY  - CPAPER
TI  - Learning for Larger Datasets with the Gaussian Process Latent Variable Model
AU  - Neil D. Lawrence
BT  - Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics
PY  - 2007/01/01
DA  - 2007/01/01
ED  - Marina Meila
ED  - Xiaotong Shen	
ID  - lawrence-larger07
PB  - Omnipress	
SP  - 243
EP  - 250
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/gplvmLarger.pdf
UR  - http://inverseprobability.com/publications/lawrence-larger07.html
AB  - In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
ER  -

Lawrence, N.D. (2007). Learning for Larger Datasets with the Gaussian Process Latent Variable Model. Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics:243-250