Hierarchical Gaussian Process Latent Variable Models

Neil D. Lawrence, Andrew J. Moore
Proceedings of the International Conference in Machine Learning, Omnipress 24:481-488, 2007.

Abstract

The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy, which is applied to the visualisation of human motion data sets.
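
The construction described in the abstract can be illustrated with a small numerical sketch. The snippet below is an assumption-laden illustration, not the authors' code: it fixes an RBF kernel with hand-picked hyperparameters and treats the hierarchical objective as a sum of standard GP-LVM log-likelihoods, one for the mapping from a root latent space to the concatenated leaf latent spaces and one per leaf mapping to its own block of observations (for the human motion example, the blocks would correspond to body parts). The function names (rbf_kernel, gplvm_log_likelihood, hierarchy_log_likelihood) and the toy dimensions are invented for illustration; in practice the latent points and kernel parameters would be optimised jointly with a gradient-based method.

```python
import numpy as np

def rbf_kernel(X, variance=1.0, lengthscale=1.0, noise=1e-3):
    """Squared-exponential covariance with a small diagonal noise term."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(X**2, axis=1)[None, :]
                - 2.0 * X @ X.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2) + noise * np.eye(len(X))

def gplvm_log_likelihood(X, Y):
    """Log p(Y | X) for a GP-LVM: each of Y's D columns is an independent
    GP over the latent points X, sharing the same kernel."""
    N, D = Y.shape
    K = rbf_kernel(X)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (D * logdet
                   + np.trace(np.linalg.solve(K, Y @ Y.T))
                   + N * D * np.log(2.0 * np.pi))

def hierarchy_log_likelihood(X_root, X_leaves, Y_leaves):
    """Two-layer tree: the root latent space generates the leaf latent
    spaces through one GP-LVM, and each leaf latent space generates its
    own block of observed data (e.g. one body part) through another."""
    ll = gplvm_log_likelihood(X_root, np.hstack(X_leaves))   # root -> leaf latents
    for X_c, Y_c in zip(X_leaves, Y_leaves):                 # leaf latents -> data
        ll += gplvm_log_likelihood(X_c, Y_c)
    return ll

# Toy usage: 30 frames, two "body part" data blocks, 2-D latent spaces.
rng = np.random.default_rng(0)
Y_leaves = [rng.normal(size=(30, 6)), rng.normal(size=(30, 9))]
X_leaves = [rng.normal(size=(30, 2)), rng.normal(size=(30, 2))]
X_root = rng.normal(size=(30, 2))
print(hierarchy_log_likelihood(X_root, X_leaves, Y_leaves))
```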

Cite this Paper


BibTeX
@InProceedings{Lawrence:hgplvm07,
  title     = {Hierarchical {G}aussian Process Latent Variable Models},
  author    = {Lawrence, Neil D. and Moore, Andrew J.},
  booktitle = {Proceedings of the International Conference in Machine Learning},
  pages     = {481--488},
  year      = {2007},
  editor    = {Ghahramani, Zoubin},
  volume    = {24},
  publisher = {Omnipress},
  pdf       = {https://icml.cc/imls/conferences/2007/proceedings/papers/408.pdf},
  url       = {http://inverseprobability.com/publications/lawrence-hgplvm07.html},
  abstract  = {The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model, we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.}
}
Endnote
%0 Conference Paper
%T Hierarchical Gaussian Process Latent Variable Models
%A Neil D. Lawrence
%A Andrew J. Moore
%B Proceedings of the International Conference in Machine Learning
%D 2007
%E Zoubin Ghahramani
%F Lawrence:hgplvm07
%I Omnipress
%P 481--488
%U http://inverseprobability.com/publications/lawrence-hgplvm07.html
%V 24
%X The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model, we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
RIS
TY - CPAPER
TI - Hierarchical Gaussian Process Latent Variable Models
AU - Neil D. Lawrence
AU - Andrew J. Moore
BT - Proceedings of the International Conference in Machine Learning
DA - 2007/01/01
ED - Zoubin Ghahramani
ID - Lawrence:hgplvm07
PB - Omnipress
VL - 24
SP - 481
EP - 488
L1 - https://icml.cc/imls/conferences/2007/proceedings/papers/408.pdf
UR - http://inverseprobability.com/publications/lawrence-hgplvm07.html
AB - The Gaussian process latent variable model (GP-LVM) is a powerful approach for probabilistic modelling of high dimensional data through dimensional reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model, we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
ER -
APA
Lawrence, N.D. & Moore, A.J. (2007). Hierarchical Gaussian Process Latent Variable Models. Proceedings of the International Conference in Machine Learning 24:481-488. Available from http://inverseprobability.com/publications/lawrence-hgplvm07.html.
