Gaussian Process Models for Visualisation of High Dimensional Data

Neil D. Lawrence
Advances in Neural Information Processing Systems, 16:329-336, 2004.

Abstract

In this paper we introduce a new underlying probabilistic model for principal component analysis (PCA). Our formulation interprets PCA as a particular Gaussian process prior on a mapping from a latent space to the observed data-space. We show that if the prior’s covariance function constrains the mappings to be linear the model is equivalent to PCA; we then extend the model by considering less restrictive covariance functions which allow non-linear mappings. This more general Gaussian process latent variable model (GPLVM) is then evaluated as an approach to the visualisation of high dimensional data for three different data-sets. Additionally, our non-linear algorithm can be *further* kernelised, leading to ‘twin kernel PCA’, in which a *mapping between feature spaces* occurs.
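
To make the construction in the abstract concrete, the sketch below writes out the GPLVM objective it describes: each dimension of the data Y is modelled as an independent zero-mean Gaussian process over latent inputs X, the latent coordinates are initialised with PCA (the linear special case), and then optimised to maximise the marginal likelihood. This is a minimal illustrative reconstruction, not the paper's code; the RBF covariance, the fixed noise level, and the L-BFGS optimiser with numerical gradients are assumptions made for brevity.

```python
# Minimal GPLVM sketch (illustrative, not the paper's implementation).
# Model: log p(Y | X) = -D/2 log|K| - 1/2 tr(K^{-1} Y Y^T) + const,
# where K = K_rbf(X, X) + noise * I and X are the latent coordinates.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between latent points X (N x q)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_likelihood(x_flat, Y, q, noise=1e-2):
    """-log p(Y | X) up to a constant, via a Cholesky factorisation of K."""
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)   # noise keeps K positive definite
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, Y)           # L^{-1} Y
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * D * log_det + 0.5 * np.sum(alpha ** 2)

def fit_gplvm(Y, q=2, iters=200):
    """Return N x q latent coordinates maximising the GP marginal likelihood."""
    N = Y.shape[0]
    Yc = Y - Y.mean(axis=0)                 # centre data for a zero-mean GP
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    X0 = Yc @ Vt[:q].T                      # PCA initialisation (linear case)
    res = minimize(neg_log_likelihood, X0.ravel(), args=(Yc, q),
                   method="L-BFGS-B", options={"maxiter": iters})
    return res.x.reshape(N, q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Y = rng.standard_normal((40, 5))        # stand-in for real data
    X = fit_gplvm(Y)
    print(X.shape)                          # (40, 2): coordinates to plot
```

A fuller treatment would also optimise the kernel hyperparameters and noise variance alongside X; fixing them here keeps the sketch short.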

Cite this Paper


BibTeX
@InProceedings{lawrence-gplvm03,
  title = {Gaussian Process Models for Visualisation of High Dimensional Data},
  author = {Neil D. Lawrence},
  booktitle = {Advances in Neural Information Processing Systems},
  pages = {329--336},
  year = {2004},
  volume = {16},
  publisher = {MIT Press},
  address = {Cambridge, MA},
  url = {http://inverseprobability.com/publications/lawrence-gplvm03.html},
  abstract = {In this paper we introduce a new underlying probabilistic model for principal component analysis (PCA). Our formulation interprets PCA as a particular Gaussian process prior on a mapping from a latent space to the observed data-space. We show that if the prior’s covariance function constrains the mappings to be linear the model is equivalent to PCA; we then extend the model by considering less restrictive covariance functions which allow non-linear mappings. This more general Gaussian process latent variable model (GPLVM) is then evaluated as an approach to the visualisation of high dimensional data for three different data-sets. Additionally, our non-linear algorithm can be *further* kernelised, leading to ‘twin kernel PCA’, in which a *mapping between feature spaces* occurs.}
}
Endnote
%0 Conference Paper
%T Gaussian Process Models for Visualisation of High Dimensional Data
%A Neil D. Lawrence
%B Advances in Neural Information Processing Systems
%C Cambridge, MA
%D 2004
%F lawrence-gplvm03
%I MIT Press
%P 329--336
%U http://inverseprobability.com/publications/lawrence-gplvm03.html
%V 16
%X In this paper we introduce a new underlying probabilistic model for principal component analysis (PCA). Our formulation interprets PCA as a particular Gaussian process prior on a mapping from a latent space to the observed data-space. We show that if the prior’s covariance function constrains the mappings to be linear the model is equivalent to PCA; we then extend the model by considering less restrictive covariance functions which allow non-linear mappings. This more general Gaussian process latent variable model (GPLVM) is then evaluated as an approach to the visualisation of high dimensional data for three different data-sets. Additionally, our non-linear algorithm can be *further* kernelised, leading to ‘twin kernel PCA’, in which a *mapping between feature spaces* occurs.
RIS
TY - CPAPER
TI - Gaussian Process Models for Visualisation of High Dimensional Data
AU - Neil D. Lawrence
BT - Advances in Neural Information Processing Systems
PY - 2004
ID - lawrence-gplvm03
PB - MIT Press
CY - Cambridge, MA
VL - 16
SP - 329
EP - 336
UR - http://inverseprobability.com/publications/lawrence-gplvm03.html
AB - In this paper we introduce a new underlying probabilistic model for principal component analysis (PCA). Our formulation interprets PCA as a particular Gaussian process prior on a mapping from a latent space to the observed data-space. We show that if the prior’s covariance function constrains the mappings to be linear the model is equivalent to PCA; we then extend the model by considering less restrictive covariance functions which allow non-linear mappings. This more general Gaussian process latent variable model (GPLVM) is then evaluated as an approach to the visualisation of high dimensional data for three different data-sets. Additionally, our non-linear algorithm can be *further* kernelised, leading to ‘twin kernel PCA’, in which a *mapping between feature spaces* occurs.
ER -
APA
Lawrence, N.D. (2004). Gaussian Process Models for Visualisation of High Dimensional Data. In Advances in Neural Information Processing Systems 16 (pp. 329-336). Cambridge, MA: MIT Press.
