Manifold Relevance Determination

Andreas Damianou, University of Sheffield
Carl Henrik Ek, University of Bristol
Michalis K. Titsias, University of Athens
Neil D. Lawrence, University of Sheffield

in Proceedings of the International Conference on Machine Learning 29

Abstract

In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
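The "softly" shared factorization described above can be illustrated with a small sketch. This is not the authors' code: the per-dimension relevance weights below are made-up numbers standing in for the ARD weights the model learns variationally, and the threshold is a hypothetical cut-off. The sketch only shows how per-view relevance weights induce shared, private, and switched-off latent dimensions.

```python
# Illustrative sketch (not the paper's implementation): per-view ARD
# relevance weights inducing a soft shared/private latent factorization.
THRESHOLD = 0.1  # hypothetical cut-off for a dimension being "switched on"

# One relevance weight per latent dimension, for each of two data views.
# In the model these are learned; here they are invented for illustration.
ard_view1 = [0.9, 0.8, 0.02, 0.7, 0.01]
ard_view2 = [0.85, 0.03, 0.9, 0.6, 0.02]

def active(weights, thresh=THRESHOLD):
    """Indices of latent dimensions relevant to a view."""
    return {q for q, w in enumerate(weights) if w > thresh}

a1, a2 = active(ard_view1), active(ard_view2)
shared = a1 & a2                                   # used by both views
private1 = a1 - a2                                 # used only by view 1
private2 = a2 - a1                                 # used only by view 2
switched_off = set(range(len(ard_view1))) - (a1 | a2)

print(sorted(shared), sorted(private1), sorted(private2), sorted(switched_off))
# → [0, 3] [1] [2] [4]
```

Because the weights are continuous rather than binary, a dimension can be weakly relevant to both views, which is the relaxation of the discrete segmentation the abstract refers to.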


@InProceedings{damianou-manifold12,
  title = 	 {Manifold Relevance Determination},
  author = 	 {Andreas Damianou and Carl Henrik Ek and Michalis K. Titsias and Neil D. Lawrence},
  booktitle = 	 {Proceedings of the International Conference on Machine Learning},
  year = 	 {2012},
  editor = 	 {John Langford and Joelle Pineau},
  volume = 	 {29},
  address = 	 {San Francisco, CA},
  publisher = 	 {Morgan Kaufmann},
  edit = 	 {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2012-01-01-damianou-manifold12.md},
  url =  	 {http://inverseprobability.com/publications/damianou-manifold12.html},
  abstract = 	 {In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.},
  crossref =  {Langford:icml12},
  key = 	 {Damianou:manifold12},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/mrdICML2012.pdf}
}
%T Manifold Relevance Determination
%A Andreas Damianou and Carl Henrik Ek and Michalis K. Titsias and Neil D. Lawrence
%B 
%C Proceedings of the International Conference on Machine Learning
%D 2012
%E John Langford and Joelle Pineau
%F damianou-manifold12
%I Morgan Kaufmann
%P --
%R 
%U http://inverseprobability.com/publications/damianou-manifold12.html
%V 29
%X In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
TY  - CPAPER
TI  - Manifold Relevance Determination
AU  - Andreas Damianou
AU  - Carl Henrik Ek
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
BT  - Proceedings of the International Conference on Machine Learning
PY  - 2012/01/01
DA  - 2012/01/01
ED  - John Langford
ED  - Joelle Pineau
ID  - damianou-manifold12
PB  - Morgan Kaufmann
SP  - 
EP  - 
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/mrdICML2012.pdf
UR  - http://inverseprobability.com/publications/damianou-manifold12.html
AB  - In this paper we present a fully Bayesian latent variable model which exploits conditional nonlinear (in)-dependence structures to learn an efficient latent representation. The latent space is factorized to represent shared and private information from multiple views of the data. In contrast to previous approaches, we introduce a relaxation to the discrete segmentation and allow for a “softly” shared latent space. Further, Bayesian techniques allow us to automatically estimate the dimensionality of the latent spaces. The model is capable of capturing structure underlying extremely high dimensional spaces. This is illustrated by modelling unprocessed images with tens of thousands of pixels. This also allows us to directly generate novel images from the trained model by sampling from the discovered latent spaces. We also demonstrate the model by prediction of human pose in an ambiguous setting. Our Bayesian framework allows us to perform disambiguation in a principled manner by including latent space priors which incorporate the dynamic nature of the data.
ER  -

Damianou, A., Ek, C.H., Titsias, M.K. & Lawrence, N.D. (2012). Manifold Relevance Determination. Proceedings of the International Conference on Machine Learning 29.