Local Distance Preservation in the GP-LVM through Back Constraints

Neil D. Lawrence, Joaquin Quiñonero Candela
In Proceedings of the 23rd International Conference on Machine Learning (ICML 2006), pp. 513–520, 2006.

Abstract

The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) (Tipping and Bishop, 1999). While most approaches to non-linear dimensionality reduction focus on preserving local distances in data space, the GP-LVM focusses on exactly the opposite: being a smooth mapping from latent to data space, it keeps points apart in latent space that are far apart in data space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.
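
To make the back-constraint idea concrete, the sketch below (NumPy/SciPy, not the authors' code) shows one simple instance under illustrative assumptions: an RBF kernel in both latent and data space, fixed hyperparameters, and a kernel-regression back constraint X = K_y A, so that the latent coordinates are a smooth function of the data rather than free parameters. All function names and hyperparameter values here are ours.

import numpy as np


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between the rows of A and the rows of B.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)


def gplvm_neg_log_likelihood(X, Y, noise=1e-2):
    # Negative GP-LVM log marginal likelihood, -log p(Y | X), up to an additive constant.
    N, D = Y.shape
    K = rbf_kernel(X, X) + noise * np.eye(N)              # covariance built in latent space
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))   # K^{-1} Y
    log_det = 2.0 * np.log(np.diag(L)).sum()
    return 0.5 * D * log_det + 0.5 * np.trace(Y.T @ alpha)


def back_constrained_objective(A_flat, Y, q=2):
    # Back constraint: latent points are a smooth kernel regression of the data,
    # X = K_y A, so the free parameters are the weights A rather than X itself.
    N = Y.shape[0]
    A = A_flat.reshape(N, q)
    K_y = rbf_kernel(Y, Y)        # similarity measured in *data* space
    X = K_y @ A                   # x_n = sum_m k(y_n, y_m) a_m
    return gplvm_neg_log_likelihood(X, Y)


if __name__ == "__main__":
    # Toy usage: optimise the back-constraint weights with a generic optimiser,
    # then read off the embedding X = K_y A.
    from scipy.optimize import minimize
    rng = np.random.default_rng(0)
    Y = rng.standard_normal((30, 5))              # toy data: 30 points in 5 dimensions
    A0 = 0.01 * rng.standard_normal(30 * 2)
    res = minimize(back_constrained_objective, A0, args=(Y,), method="L-BFGS-B")
    X = rbf_kernel(Y, Y) @ res.x.reshape(30, 2)   # the learned 2-d embedding

Because X is tied to the data through the data-space kernel K_y, points that are close in data space are forced onto nearby latent coordinates; this is how the back constraint adds local distance preservation to the GP-LVM's usual behaviour of keeping dissimilar points apart.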

Cite this Paper


BibTeX
@InProceedings{pmlr-v-lawrence-backconstraints06,
  title     = {Local Distance Preservation in the GP-LVM through Back Constraints},
  author    = {Neil D. Lawrence and Joaquin Quiñonero Candela},
  booktitle = {Proceedings of the 23rd International Conference on Machine Learning (ICML 2006)},
  pages     = {513--520},
  year      = {2006},
  publisher = {ACM},
  doi       = {10.1145/1143844.1143909},
  url       = {http://inverseprobability.com/publications/lawrence-backconstraints06.html}
}
Endnote
%0 Conference Paper
%T Local Distance Preservation in the GP-LVM through Back Constraints
%A Neil D. Lawrence
%A Joaquin Quiñonero Candela
%B Proceedings of the 23rd International Conference on Machine Learning (ICML 2006)
%D 2006
%F pmlr-v-lawrence-backconstraints06
%I ACM
%P 513--520
%R 10.1145/1143844.1143909
%U http://inverseprobability.com/publications/lawrence-backconstraints06.html
RIS
TY - CPAPER
TI - Local Distance Preservation in the GP-LVM through Back Constraints
AU - Neil D. Lawrence
AU - Joaquin Quiñonero Candela
BT - Proceedings of the 23rd International Conference on Machine Learning (ICML 2006)
PY - 2006
ID - pmlr-v-lawrence-backconstraints06
PB - ACM
SP - 513
EP - 520
DO - 10.1145/1143844.1143909
UR - http://inverseprobability.com/publications/lawrence-backconstraints06.html
ER -
APA
Lawrence, N.D. & Quiñonero Candela, J. (2006). Local Distance Preservation in the GP-LVM through Back Constraints. In Proceedings of the 23rd International Conference on Machine Learning (ICML 2006) (pp. 513–520). ACM.
