Variational Gaussian Process Dynamical Systems

Andreas Damianou, Michalis K. Titsias and Neil D. Lawrence
Advances in Neural Information Processing Systems 24, MIT Press, 2011.

Abstract

High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
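The abstract describes a generative structure in which a temporal Gaussian process prior is placed over latent trajectories, which are then mapped to high-dimensional observations through a second Gaussian process whose ARD kernel can switch off irrelevant latent dimensions. The minimal NumPy sketch below samples from such a generative model to illustrate that structure; it is not the authors' implementation and does not include their variational inference scheme, and the kernel choices, lengthscales and dimensions are illustrative assumptions.

import numpy as np

def rbf_kernel(A, B, variance=1.0, lengthscales=1.0):
    """Squared-exponential kernel; `lengthscales` may be scalar or per-dimension (ARD)."""
    A_scaled = A / lengthscales
    B_scaled = B / lengthscales
    sq_dist = (
        np.sum(A_scaled**2, axis=1)[:, None]
        + np.sum(B_scaled**2, axis=1)[None, :]
        - 2.0 * A_scaled @ B_scaled.T
    )
    return variance * np.exp(-0.5 * sq_dist)

rng = np.random.default_rng(0)
N, Q, D = 100, 3, 20                      # time points, latent dimensions, observed dimensions
t = np.linspace(0.0, 10.0, N)[:, None]

# Temporal GP prior over the latent space: each latent dimension is a smooth function of time.
K_t = rbf_kernel(t, t, variance=1.0, lengthscales=1.0) + 1e-6 * np.eye(N)
X = rng.multivariate_normal(np.zeros(N), K_t, size=Q).T          # N x Q latent trajectories

# GP mapping from latent space to observations, with ARD lengthscales per latent dimension
# (a large lengthscale effectively switches a latent dimension off).
ard_lengthscales = np.array([1.0, 2.0, 5.0])                     # assumed values for illustration
K_x = rbf_kernel(X, X, variance=1.0, lengthscales=ard_lengthscales) + 1e-6 * np.eye(N)
F = rng.multivariate_normal(np.zeros(N), K_x, size=D).T          # N x D noiseless outputs
Y = F + 0.1 * rng.standard_normal((N, D))                        # observed high-dimensional series

Running the sketch yields Y, a synthetic high-dimensional time series generated from smooth latent trajectories, which is the kind of data the model in the paper is designed to recover latent representations from.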

Cite this Paper


BibTeX
@InProceedings{Damianou:vgpds11,
  title = {Variational {G}aussian Process Dynamical Systems},
  author = {Damianou, Andreas and Titsias, Michalis K. and Lawrence, Neil D.},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2011},
  editor = {Bartlett, Peter and Pereira, Fernando and Williams, Christopher K. I. and Lafferty, John},
  volume = {24},
  address = {Cambridge, MA},
  publisher = {MIT Press},
  pdf = {https://proceedings.neurips.cc/paper/2011/file/af4732711661056eadbf798ba191272a-Paper.pdf},
  url = {http://inverseprobability.com/publications/damianou-vgpds11.html},
  abstract = {High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.}
}
Endnote
%0 Conference Paper
%T Variational Gaussian Process Dynamical Systems
%A Andreas Damianou
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Advances in Neural Information Processing Systems
%D 2011
%E Peter Bartlett
%E Fernando Pereira
%E Christopher K. I. Williams
%E John Lafferty
%F Damianou:vgpds11
%I MIT Press
%U http://inverseprobability.com/publications/damianou-vgpds11.html
%V 24
%X High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
RIS
TY - CPAPER
TI - Variational Gaussian Process Dynamical Systems
AU - Andreas Damianou
AU - Michalis K. Titsias
AU - Neil D. Lawrence
BT - Advances in Neural Information Processing Systems
DA - 2011/01/01
ED - Peter Bartlett
ED - Fernando Pereira
ED - Christopher K. I. Williams
ED - John Lafferty
ID - Damianou:vgpds11
PB - MIT Press
VL - 24
L1 - https://proceedings.neurips.cc/paper/2011/file/af4732711661056eadbf798ba191272a-Paper.pdf
UR - http://inverseprobability.com/publications/damianou-vgpds11.html
AB - High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space. The approach also allows for the appropriate dimensionality of the latent space to be automatically determined. We demonstrate the model on a human motion capture data set and a series of high resolution video sequences.
ER -
APA
Damianou, A., Titsias, M.K. & Lawrence, N.D. (2011). Variational Gaussian Process Dynamical Systems. Advances in Neural Information Processing Systems 24. Available from http://inverseprobability.com/publications/damianou-vgpds11.html.