Topologically-Constrained Latent Variable Models

Raquel Urtasun, David J. Fleet, Andreas Geiger, Jovan Popović, Trevor J. Darrell, Neil D. Lawrence
Proceedings of the International Conference in Machine Learning, Omnipress 25:1080-1087, 2008.

Abstract

In dimensionality reduction approaches, the data are typically embedded in a Euclidean latent space. However, for some data sets this is inappropriate. For example, in human motion data we expect latent spaces that are cylindrical or toroidal, which are poorly captured by a Euclidean space. In this paper, we present a range of approaches for embedding data in a non-Euclidean latent space. Our focus is the Gaussian Process latent variable model. In the context of human motion modeling, this allows us to (a) learn models with interpretable latent directions enabling, for example, style/content separation, and (b) generalise beyond the data set, enabling us to learn transitions between motion styles even though such transitions are not present in the data.
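The circular topology underlying a cylindrical latent space can be illustrated with a minimal sketch (this is an illustration of the general idea, not the paper's construction): embedding a phase angle onto the unit circle via (cos θ, sin θ) makes a standard RBF kernel respect the periodicity, so latent points at θ and θ + 2π coincide rather than lying at opposite ends of a Euclidean axis.

```python
import numpy as np

def circular_embed(theta):
    """Map a phase angle onto the unit circle, so that theta and
    theta + 2*pi land on the same latent point."""
    return np.stack([np.cos(theta), np.sin(theta)], axis=-1)

def rbf_kernel(X, Y, lengthscale=1.0):
    """Standard RBF (squared-exponential) kernel between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Eight phases evenly spaced around the circle.
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
Z = circular_embed(theta)   # latent points on the circular dimension
K = rbf_kernel(Z, Z)

# The kernel wraps around: the first and last phases are adjacent on
# the circle, so their covariance matches that of any adjacent pair.
print(np.allclose(K[0, -1], K[0, 1]))  # True
```

A cylindrical latent space for cyclic motion such as walking would pair one such circular dimension (the gait phase) with an ordinary Euclidean dimension (e.g. style or speed).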

Cite this Paper


BibTeX
@InProceedings{Urtasun:topology08,
  title = {Topologically-Constrained Latent Variable Models},
  author = {Urtasun, Raquel and Fleet, David J. and Geiger, Andreas and Popović, Jovan and Darrell, Trevor J. and Lawrence, Neil D.},
  booktitle = {Proceedings of the International Conference in Machine Learning},
  pages = {1080--1087},
  year = {2008},
  editor = {Roweis, Sam and McCallum, Andrew},
  volume = {25},
  publisher = {Omnipress},
  doi = {10.1145/1390156.1390292},
  pdf = {https://lawrennd.github.io/publications/files/topology.pdf},
  url = {http://inverseprobability.com/publications/urtasun-topology08.html},
  abstract = {In dimensionality reduction approaches, the data are typically embedded in a Euclidean latent space. However, for some data sets this is inappropriate. For example, in human motion data we expect latent spaces that are cylindrical or toroidal, which are poorly captured by a Euclidean space. In this paper, we present a range of approaches for embedding data in a non-Euclidean latent space. Our focus is the Gaussian Process latent variable model. In the context of human motion modeling, this allows us to (a) learn models with interpretable latent directions enabling, for example, style/content separation, and (b) generalise beyond the data set, enabling us to learn transitions between motion styles even though such transitions are not present in the data.}
}
Endnote
%0 Conference Paper
%T Topologically-Constrained Latent Variable Models
%A Raquel Urtasun
%A David J. Fleet
%A Andreas Geiger
%A Jovan Popović
%A Trevor J. Darrell
%A Neil D. Lawrence
%B Proceedings of the International Conference in Machine Learning
%D 2008
%E Sam Roweis
%E Andrew McCallum
%F Urtasun:topology08
%I Omnipress
%P 1080--1087
%R 10.1145/1390156.1390292
%U http://inverseprobability.com/publications/urtasun-topology08.html
%V 25
%X In dimensionality reduction approaches, the data are typically embedded in a Euclidean latent space. However, for some data sets this is inappropriate. For example, in human motion data we expect latent spaces that are cylindrical or toroidal, which are poorly captured by a Euclidean space. In this paper, we present a range of approaches for embedding data in a non-Euclidean latent space. Our focus is the Gaussian Process latent variable model. In the context of human motion modeling, this allows us to (a) learn models with interpretable latent directions enabling, for example, style/content separation, and (b) generalise beyond the data set, enabling us to learn transitions between motion styles even though such transitions are not present in the data.
RIS
TY - CPAPER
TI - Topologically-Constrained Latent Variable Models
AU - Raquel Urtasun
AU - David J. Fleet
AU - Andreas Geiger
AU - Jovan Popović
AU - Trevor J. Darrell
AU - Neil D. Lawrence
BT - Proceedings of the International Conference in Machine Learning
DA - 2008/07/05
ED - Sam Roweis
ED - Andrew McCallum
ID - Urtasun:topology08
PB - Omnipress
VL - 25
SP - 1080
EP - 1087
DO - 10.1145/1390156.1390292
L1 - https://lawrennd.github.io/publications/files/topology.pdf
UR - http://inverseprobability.com/publications/urtasun-topology08.html
AB - In dimensionality reduction approaches, the data are typically embedded in a Euclidean latent space. However, for some data sets this is inappropriate. For example, in human motion data we expect latent spaces that are cylindrical or toroidal, which are poorly captured by a Euclidean space. In this paper, we present a range of approaches for embedding data in a non-Euclidean latent space. Our focus is the Gaussian Process latent variable model. In the context of human motion modeling, this allows us to (a) learn models with interpretable latent directions enabling, for example, style/content separation, and (b) generalise beyond the data set, enabling us to learn transitions between motion styles even though such transitions are not present in the data.
ER -
APA
Urtasun, R., Fleet, D.J., Geiger, A., Popović, J., Darrell, T.J. & Lawrence, N.D. (2008). Topologically-Constrained Latent Variable Models. Proceedings of the International Conference in Machine Learning 25:1080-1087. doi:10.1145/1390156.1390292. Available from http://inverseprobability.com/publications/urtasun-topology08.html.