Manifold Relevance Determination

Andreas Damianou, Carl Henrik Ek, Michalis K. Titsias, Neil D. Lawrence
Proceedings of the International Conference on Machine Learning 29, Morgan Kaufmann, 2012.

Abstract

In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
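
The soft factorization described in the abstract can be sketched in a few lines: each view is assigned its own vector of ARD relevance weights over a common latent space, and a latent dimension is effectively shared when both views give it appreciable weight, private when only one view does, and pruned when neither does. The Python sketch below uses hand-set weights purely for illustration; in the model itself these weights are learned by variational inference, and the threshold used here is an arbitrary choice for the example, not part of the method (the GPy library includes an MRD model along these lines).

import numpy as np

# Illustrative ARD relevance weights for two views over a Q = 6 dimensional
# latent space. In MRD one weight vector per view is learned automatically;
# here they are hand-set purely to show the idea of a "softly" shared space.
w_view1 = np.array([0.90, 0.85, 0.02, 0.01, 0.40, 0.00])
w_view2 = np.array([0.88, 0.03, 0.75, 0.01, 0.35, 0.00])

threshold = 0.05          # relevance below this is treated as switched off
active1 = w_view1 > threshold
active2 = w_view2 > threshold

shared   = np.where(active1 & active2)[0]    # dimensions both views rely on
private1 = np.where(active1 & ~active2)[0]   # dimensions only view 1 uses
private2 = np.where(~active1 & active2)[0]   # dimensions only view 2 uses
unused   = np.where(~active1 & ~active2)[0]  # dimensions pruned by the ARD prior

print("shared:", shared)                     # -> [0 4]
print("private to view 1:", private1)        # -> [1]
print("private to view 2:", private2)        # -> [2]
print("switched off:", unused)               # -> [3 5]

Because the weights are continuous rather than binary, the segmentation is soft: a dimension such as the fifth one above can be moderately relevant to both views rather than belonging wholly to one, which is the relaxation of a hard shared/private split that the paper proposes.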

Cite this Paper


BibTeX
@InProceedings{Damianou-manifold12,
  title = {Manifold Relevance Determination},
  author = {Damianou, Andreas and Ek, Carl Henrik and Titsias, Michalis K. and Lawrence, Neil D.},
  booktitle = {Proceedings of the International Conference on Machine Learning},
  year = {2012},
  editor = {Langford, John and Pineau, Joelle},
  volume = {29},
  address = {San Francisco, CA},
  publisher = {Morgan Kaufmann},
  pdf = {https://icml.cc/2012/papers/94.pdf},
  url = {http://inverseprobability.com/publications/damianou-manifold12.html},
  abstract = {In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.}
}
Endnote
%0 Conference Paper
%T Manifold Relevance Determination
%A Andreas Damianou
%A Carl Henrik Ek
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Proceedings of the International Conference on Machine Learning
%D 2012
%E John Langford
%E Joelle Pineau
%F Damianou-manifold12
%I Morgan Kaufmann
%U http://inverseprobability.com/publications/damianou-manifold12.html
%V 29
%X In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
RIS
TY - CPAPER
TI - Manifold Relevance Determination
AU - Andreas Damianou
AU - Carl Henrik Ek
AU - Michalis K. Titsias
AU - Neil D. Lawrence
BT - Proceedings of the International Conference on Machine Learning
DA - 2012/06/26
ED - John Langford
ED - Joelle Pineau
ID - Damianou-manifold12
PB - Morgan Kaufmann
VL - 29
L1 - https://icml.cc/2012/papers/94.pdf
UR - http://inverseprobability.com/publications/damianou-manifold12.html
AB - In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
ER -
APA
Damianou, A., Ek, C.H., Titsias, M.K. & Lawrence, N.D. (2012). Manifold Relevance Determination. Proceedings of the International Conference on Machine Learning 29. Available from http://inverseprobability.com/publications/damianou-manifold12.html.

Related Material