Variational Gaussian Process Dynamical Systems


Andreas Damianou, University of Sheffield
Michalis K. Titsias, University of Athens
Neil D. Lawrence, University of Sheffield

in Advances in Neural Information Processing Systems 24


Abstract

High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
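
As a brief sketch of the model the abstract describes (notation assumed here, following the paper's setup): observed outputs $Y \in \mathbb{R}^{N \times D}$ at times $t_1,\dots,t_N$ are generated from latent trajectories $X \in \mathbb{R}^{N \times Q}$, where each latent dimension carries a temporal Gaussian process prior and each output dimension is a Gaussian process mapping of the latent points,

$$x_q(t) \sim \mathcal{GP}\big(0, k_x(t, t')\big), \quad q = 1,\dots,Q,$$

$$f_d(\mathbf{x}) \sim \mathcal{GP}\big(0, k_f(\mathbf{x}, \mathbf{x}')\big), \quad d = 1,\dots,D,$$

$$y_{nd} = f_d(\mathbf{x}_n) + \epsilon_{nd}, \quad \epsilon_{nd} \sim \mathcal{N}(0, \beta^{-1}).$$

With an ARD covariance $k_f(\mathbf{x}, \mathbf{x}') = \sigma_f^2 \exp\!\big(-\tfrac{1}{2}\sum_{q=1}^{Q} w_q (x_q - x'_q)^2\big)$, maximising the variational lower bound drives the weights $w_q$ of unneeded latent dimensions towards zero, which is how the latent dimensionality is determined automatically.

The released software (linked in the reference below) is the MATLAB vargplvm toolbox. Purely as an illustrative sketch, and not the authors' code, the non-dynamical special case (a Bayesian GP-LVM with an ARD kernel) can be fitted with the Python GPy library, assuming GPy is installed; the data, dimensions and settings here are placeholder assumptions.

```python
import numpy as np
import GPy

# Placeholder data: N frames of a D-dimensional feature vector (assumed values).
N, D, Q = 100, 12, 8
Y = np.random.randn(N, D)

# ARD RBF kernel: one inverse-lengthscale weight per latent dimension.
kern = GPy.kern.RBF(Q, ARD=True)

# Variational Bayesian GP-LVM (no temporal prior on the latent space).
m = GPy.models.BayesianGPLVM(Y, Q, kernel=kern, num_inducing=20)
m.optimize(messages=False, max_iters=1000)

# Large inverse lengthscales indicate latent dimensions the data actually uses.
print(1.0 / m.kern.lengthscale)
```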


@InProceedings{damianou-vgpds11,
  title = 	 {Variational Gaussian Process Dynamical Systems},
  author = 	 {Andreas Damianou and Michalis K. Titsias and Neil D. Lawrence},
  booktitle = 	 {Advances in Neural Information Processing Systems},
  year = 	 {2011},
  editor = 	 {Peter Bartlett and Fernando Pereira and Christopher K. I. Williams and John Lafferty},
  volume = 	 {24},
  address = 	 {Cambridge, MA},
  publisher = 	 {MIT Press},
  edit = 	 {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2011-01-01-damianou-vgpds11.md},
  url =  	 {http://inverseprobability.com/publications/damianou-vgpds11.html},
  abstract = 	 {High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.},
  crossref =  {Bartlett:nips11},
  key = 	 {Damianou:vgpds11},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/VGPDS_Nips11.pdf},
  linksoftware = {https://github.com/SheffieldML/vargplvm},
  OPTgroup = 	 {}
}
%T Variational Gaussian Process Dynamical Systems
%A Andreas Damianou and Michalis K. Titsias and Neil D. Lawrence
%B Advances in Neural Information Processing Systems
%C Cambridge, MA
%D 2011
%E Peter Bartlett and Fernando Pereira and Christopher K. I. Williams and John Lafferty
%F damianou-vgpds11
%I MIT Press	
%P --
%R 
%U http://inverseprobability.com/publications/damianou-vgpds11.html
%V 24
%X High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
TY  - CPAPER
TI  - Variational Gaussian Process Dynamical Systems
AU  - Andreas Damianou
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
BT  - Advances in Neural Information Processing Systems
PY  - 2011/01/01
DA  - 2011/01/01
ED  - Peter Bartlett
ED  - Fernando Pereira
ED  - Christopher K. I. Williams
ED  - John Lafferty	
ID  - damianou-vgpds11
PB  - MIT Press	
SP  - 
EP  - 
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/VGPDS_Nips11.pdf
UR  - http://inverseprobability.com/publications/damianou-vgpds11.html
AB  - High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
ER  -

Damianou, A., Titsias, M.K. & Lawrence, N.D. (2011). Variational Gaussian Process Dynamical Systems. Advances in Neural Information Processing Systems 24.