Local Distance Preservation in the GP-LVM through Back Constraints

Neil D. Lawrence, University of Sheffield
Joaquin Quiñonero Candela

in Proceedings of the International Conference on Machine Learning 23, pp. 513–520

Errata

  • Equation (3) is missing the integration over $\mathbf{f}$ (a hedged reconstruction follows this list).
    Thanks to: Laurens van der Maaten
  • In the equation after Equation (5), the $v_{ij}$ should be inside the sum sign.
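The corrected form is presumably the GP-LVM marginal likelihood with the latent function values integrated out (an assumption on our part; the numbered equations appear only in the PDF):

$$p(\mathbf{Y}\mid\mathbf{X}) = \int p(\mathbf{Y}\mid\mathbf{f})\,p(\mathbf{f}\mid\mathbf{X})\,\mathrm{d}\mathbf{f}$$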

Abstract

The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) @Tipping:probpca99. While most approaches to non-linear dimensionality reduction focus on preserving local distances in data space, the GP-LVM focusses on exactly the opposite: because the mapping from latent to data space is smooth, points that are far apart in data space are kept apart in latent space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.
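To make the idea of a back constraint concrete, below is a minimal sketch in Python; the authors' own implementation is the MATLAB GPmat toolbox linked in the reference below. The sketch assumes a kernel-based back constraint: the free latent points $\mathbf{X}$ are replaced by a smooth kernel regression from data space, $\mathbf{X} = \mathbf{K}_{\mathbf{Y}\mathbf{Y}}\mathbf{A}$, and the GP-LVM likelihood is optimised over the weights $\mathbf{A}$ instead of over $\mathbf{X}$ directly. The kernel widths, noise level and toy data set are arbitrary choices for illustration.

```python
# Minimal back-constrained GP-LVM sketch (illustrative; not the GPmat code).
# Assumed choices: RBF kernels in both spaces, fixed kernel widths and noise,
# a toy 3-D data set, and numerical gradients via scipy's L-BFGS-B.
import numpy as np
from scipy.optimize import minimize

def rbf(X, Z, gamma=1.0):
    """RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nll(a_flat, Y, K_back, Q, noise=0.1):
    """GP-LVM negative log marginal likelihood, with the latent points
    back-constrained to X = K_back @ A (a smooth map from data space)."""
    N, D = Y.shape
    A = a_flat.reshape(N, Q)
    X = K_back @ A                        # back constraint: X = g(Y)
    K = rbf(X, X) + noise * np.eye(N)     # GP covariance over latent inputs
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    # 0.5 * tr(K^{-1} Y Y^T) + (D/2) * log|K|, dropping constants
    return 0.5 * np.sum(Y * alpha) + D * np.log(np.diag(L)).sum()

# Toy data: a noisy circle embedded in three dimensions.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 30, endpoint=False)
Y = np.c_[np.cos(t), np.sin(t), 0.1 * rng.standard_normal(30)]

Q = 2                                     # latent dimensionality
K_back = rbf(Y, Y, gamma=2.0)             # kernel mapping from data space
a0 = 0.1 * rng.standard_normal(30 * Q)    # initial mapping weights
res = minimize(nll, a0, args=(Y, K_back, Q), method="L-BFGS-B")
X_latent = K_back @ res.x.reshape(30, Q)  # back-constrained latent positions
print(X_latent[:5])
```

Because two points with similar rows of $\mathbf{K}_{\mathbf{Y}\mathbf{Y}}$ necessarily receive similar latent positions, local distances in data space are preserved, while the GP-LVM likelihood continues to keep dissimilar points apart.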


@InProceedings{lawrence-backconstraints06,
  title = 	 {Local Distance Preservation in the GP-LVM through Back Constraints},
  author = 	 {Neil D. Lawrence and Joaquin Quiñonero Candela},
  booktitle = 	 {Proceedings of the International Conference on Machine Learning},
  pages = 	 {513--520},
  year = 	 {2006},
  editor = 	 {William Cohen and Andrew Moore},
  volume = 	 {23},
  publisher = 	 {Omnipress},
  edit = 	 {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2006-01-01-lawrence-backconstraints06.md},
  url =  	 {http://inverseprobability.com/publications/lawrence-backconstraints06.html},
  abstract = 	 {The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) @Tipping:probpca99. While most approaches to non-linear dimensionality reduction focus on preserving local distances in data space, the GP-LVM focusses on exactly the opposite: because the mapping from latent to data space is smooth, points that are far apart in data space are kept apart in latent space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.},
  crossref =  {Cohen:icml06},
  key = 	 {Lawrence:backconstraints06},
  doi = 	 {10.1145/1143844.1143909},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/backConstraints.pdf},
  linksoftware = {https://github.com/SheffieldML/GPmat/},
  group = 	 {gplvm,dimensional reduction}
}
%T Local Distance Preservation in the GP-LVM through Back Constraints
%A Neil D. Lawrence and Joaquin Quiñonero Candela
%B Proceedings of the International Conference on Machine Learning
%D 2006
%E William Cohen and Andrew Moore
%F lawrence-backconstraints06
%I Omnipress	
%P 513--520
%R 10.1145/1143844.1143909
%U http://inverseprobability.com/publications/lawrence-backconstraints06.html
%V 23
%X The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) @Tipping:probpca99. While most approaches to non-linear dimensionality reduction focus on preserving local distances in data space, the GP-LVM focusses on exactly the opposite: because the mapping from latent to data space is smooth, points that are far apart in data space are kept apart in latent space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.
TY  - CPAPER
TI  - Local Distance Preservation in the GP-LVM through Back Constraints
AU  - Neil D. Lawrence
AU  - Joaquin Quiñonero Candela
BT  - Proceedings of the International Conference on Machine Learning
PY  - 2006/01/01
DA  - 2006/01/01
ED  - William Cohen
ED  - Andrew Moore	
ID  - lawrence-backconstraints06
PB  - Omnipress	
SP  - 513
EP  - 520
DO  - 10.1145/1143844.1143909
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/backConstraints.pdf
UR  - http://inverseprobability.com/publications/lawrence-backconstraints06.html
AB  - The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) @Tipping:probpca99. While most approaches to non-linear dimensionality reduction focus on preserving local distances in data space, the GP-LVM focusses on exactly the opposite: because the mapping from latent to data space is smooth, points that are far apart in data space are kept apart in latent space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.
ER  -

Lawrence, N.D. & Quiñonero Candela, J. (2006). Local Distance Preservation in the GP-LVM through Back Constraints. Proceedings of the International Conference on Machine Learning 23:513–520