Hierarchical Gaussian Process Latent Variable Models

Neil D. Lawrence, University of Sheffield
Andrew J. Moore

in Proceedings of the International Conference on Machine Learning 24, pp 481-488

Abstract

The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
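The generative structure described in the abstract (a tree of GP mappings, with leaves conditionally independent given their parent latent space) can be sketched in the sampling direction. This is an illustrative NumPy sketch, not the paper's implementation: the variable names and the two-branch body decomposition are assumed for illustration, and the actual model optimises the latent variables by maximising the GP likelihoods rather than sampling them.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X, lengthscale=1.0, variance=1.0, jitter=1e-6):
    """Squared-exponential (RBF) covariance matrix over the rows of X."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    K = variance * np.exp(-0.5 * sq / lengthscale**2)
    return K + jitter * np.eye(len(X))  # jitter for numerical stability

def sample_gp(X, n_outputs):
    """Draw n_outputs independent GP function values at inputs X."""
    L = np.linalg.cholesky(rbf(X))
    return L @ rng.standard_normal((len(X), n_outputs))

# Root latent space, e.g. coordinating the whole body (names are illustrative)
N = 50
X_root = rng.standard_normal((N, 2))

# Child latent spaces: GP mappings from the root. Given X_root, the two
# branches are conditionally independent -- the tree structure in the paper.
X_left = sample_gp(X_root, 3)    # e.g. latent space for one limb group
X_right = sample_gp(X_root, 3)   # e.g. latent space for another

# Leaf GPs map each child latent space to its observed data channels
Y_left = sample_gp(X_left, 20)
Y_right = sample_gp(X_right, 20)

print(Y_left.shape, Y_right.shape)  # (50, 20) (50, 20)
```

Each arrow in the tree is an independent GP mapping, so the joint model factorises along the hierarchy; training inverts this sampling direction by optimising all latent positions jointly.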


@InProceedings{lawrence-hgplvm07,
  title = 	 {Hierarchical Gaussian Process Latent Variable Models},
  author = 	 {Neil D. Lawrence and Andrew J. Moore},
  booktitle = 	 {Proceedings of the International Conference on Machine Learning},
  pages = 	 {481--488},
  year = 	 {2007},
  editor = 	 {Zoubin Ghahramani},
  volume = 	 {24},
  publisher = 	 {Omnipress},
  url =  	 {http://inverseprobability.com/publications/lawrence-hgplvm07.html},
  abstract = 	 {The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.},
  crossref =  {Ghahramani:icml07},
  key = 	 {Lawrence:hgplvm07},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/hgplvm.pdf},
  linksoftware = {https://github.com/SheffieldML/hgplvm/},
  group = 	 {manml,gp,gplvm,dimensional reduction}
}
%T Hierarchical Gaussian Process Latent Variable Models
%A Neil D. Lawrence and Andrew J. Moore
%C Proceedings of the International Conference on Machine Learning
%D 2007
%E Zoubin Ghahramani
%F lawrence-hgplvm07
%I Omnipress
%P 481--488
%U http://inverseprobability.com/publications/lawrence-hgplvm07.html
%V 24
%X The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
TY  - CPAPER
TI  - Hierarchical Gaussian Process Latent Variable Models
AU  - Neil D. Lawrence
AU  - Andrew J. Moore
BT  - Proceedings of the International Conference on Machine Learning
PY  - 2007/01/01
DA  - 2007/01/01
ED  - Zoubin Ghahramani
ID  - lawrence-hgplvm07
PB  - Omnipress
SP  - 481
EP  - 488
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/hgplvm.pdf
UR  - http://inverseprobability.com/publications/lawrence-hgplvm07.html
AB  - The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
ER  -

Lawrence, N.D. & Moore, A.J. (2007). Hierarchical Gaussian Process Latent Variable Models. Proceedings of the International Conference on Machine Learning 24:481-488.