Learning for Larger Datasets with the Gaussian Process Latent Variable Model

Neil D. Lawrence
Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics, Omnipress, pp. 243–250, 2007.

Abstract

In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
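The GP-LVM at the heart of the paper optimises latent positions by maximising a Gaussian process marginal likelihood over the data. As a rough illustration of that objective (a minimal dense sketch, not the paper's sparse approximations; all names and the toy data here are my own), one can write:

```python
# Minimal GP-LVM sketch: optimise latent X (N x Q) so that an RBF-kernel
# GP over X explains observed data Y (N x D). Illustrative only; the paper
# replaces the dense O(N^3) kernel algebra with sparse approximations.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: K[i, j] = variance * exp(-||x_i - x_j||^2 / (2 l^2))
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def neg_log_marginal(x_flat, Y, Q, noise=0.1):
    # Negative log marginal likelihood of Y given latent X, up to a constant:
    #   (D/2) log|K| + (1/2) tr(K^{-1} Y Y^T)
    N, D = Y.shape
    X = x_flat.reshape(N, Q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return D * np.sum(np.log(np.diag(L))) + 0.5 * np.sum(Y * alpha)

rng = np.random.default_rng(0)
Y = rng.standard_normal((20, 5))  # toy data: 20 points in 5 dimensions
Y -= Y.mean(0)                    # centre, as the model assumes zero mean
Q = 2                             # latent dimensionality
x0 = 0.1 * rng.standard_normal(20 * Q)
res = minimize(neg_log_marginal, x0, args=(Y, Q), method="L-BFGS-B")
X_latent = res.x.reshape(20, Q)   # learned latent positions
```

The dense Cholesky factorisation above costs O(N^3) per objective evaluation, which is exactly the bottleneck that motivates the sparse techniques compared in the paper.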

Cite this Paper


BibTeX
@InProceedings{Lawrence:larger07,
  title = {Learning for Larger Datasets with the {G}aussian Process Latent Variable Model},
  author = {Lawrence, Neil D.},
  booktitle = {Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics},
  pages = {243--250},
  year = {2007},
  editor = {Meila, Marina and Shen, Xiaotong},
  address = {San Juan, Puerto Rico},
  publisher = {Omnipress},
  pdf = {http://proceedings.mlr.press/v2/lawrence07a/lawrence07a.pdf},
  url = {http://inverseprobability.com/publications/lawrence-larger07.html},
  abstract = {In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.}
}
Endnote
%0 Conference Paper
%T Learning for Larger Datasets with the Gaussian Process Latent Variable Model
%A Neil D. Lawrence
%B Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F Lawrence:larger07
%I Omnipress
%P 243--250
%U http://inverseprobability.com/publications/lawrence-larger07.html
%X In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
RIS
TY  - CPAPER
TI  - Learning for Larger Datasets with the Gaussian Process Latent Variable Model
AU  - Neil D. Lawrence
BT  - Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - Lawrence:larger07
PB  - Omnipress
SP  - 243
EP  - 250
L1  - http://proceedings.mlr.press/v2/lawrence07a/lawrence07a.pdf
UR  - http://inverseprobability.com/publications/lawrence-larger07.html
AB  - In this paper we apply the latest techniques in sparse Gaussian process regression (GPR) to the Gaussian process latent variable model (GP-LVM). We review three techniques and discuss how they may be implemented in the context of the GP-LVM. Each approach is then implemented on a well known benchmark data set and compared with earlier attempts to sparsify the model.
ER  -
APA
Lawrence, N.D. (2007). Learning for Larger Datasets with the Gaussian Process Latent Variable Model. Proceedings of the Eleventh International Workshop on Artificial Intelligence and Statistics, 243–250. Available from http://inverseprobability.com/publications/lawrence-larger07.html.
