Metrics for Probabilistic Geometries

Alessandra Tosi, University of Oxford
Søren Hauberg
Alfredo Vellido
Neil D. Lawrence, University of Sheffield

in Uncertainty in Artificial Intelligence 30, pp 800-808

Abstract

We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.
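The key quantity is the expected pull-back metric of the latent space: with a Gaussian process mapping from a q-dimensional latent space to D-dimensional data, the Jacobian J at a latent point is Gaussian, and the metric G = J^T J has expectation E[G] = E[J]^T E[J] + D * Cov(J_row) when the Jacobian rows share a covariance. The sketch below computes this expectation for a GP regression mapping with an RBF kernel; it is a minimal illustration under those assumptions, not the authors' code, and the function names, fixed noise level and single shared lengthscale are choices made here for brevity.

import numpy as np

def rbf(X1, X2, variance=1.0, lengthscale=1.0):
    # Squared-exponential kernel: k(x, x') = variance * exp(-||x - x'||^2 / (2 lengthscale^2)).
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def expected_metric(x_star, X, Y, variance=1.0, lengthscale=1.0, noise=1e-2):
    # Expected metric tensor E[G] at a latent point x_star (shape (q,)),
    # for a GP mapping fitted to latent points X (N, q) and observations Y (N, D).
    N, q = X.shape
    D = Y.shape[1]
    K_inv = np.linalg.inv(rbf(X, X, variance, lengthscale) + noise * np.eye(N))

    # Derivative of k(x_star, x_n) with respect to x_star, for every training point: (N, q).
    k_star = rbf(x_star[None, :], X, variance, lengthscale).ravel()
    dk = -(x_star[None, :] - X) / lengthscale**2 * k_star[:, None]

    # Posterior mean of the Jacobian, one row per output dimension: (D, q).
    EJ = (K_inv @ Y).T @ dk
    # Posterior covariance of each Jacobian row, shared across output dimensions: (q, q).
    d2k = (variance / lengthscale**2) * np.eye(q)   # second kernel derivative at x_star
    cov_J = d2k - dk.T @ K_inv @ dk

    # E[J^T J] = E[J]^T E[J] + D * Cov(J_row).
    return EJ.T @ EJ + D * cov_J

# Toy usage: expected metric at the origin of a 2-D latent space with random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
Y = rng.standard_normal((50, 5))
G = expected_metric(np.zeros(2), X, Y)

Curve lengths under this expected metric, L(c) = integral of sqrt(c'(t)^T E[G(c(t))] c'(t)) dt, can then be minimised to obtain the interpolating paths and latent-space distances described in the abstract.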


@InProceedings{tosi-metrics14,
  title = 	 {Metrics for Probabilistic Geometries},
  author = 	 {Alessandra Tosi and Søren Hauberg and Alfredo Vellido and Neil D. Lawrence},
  booktitle = 	 {Uncertainty in Artificial Intelligence},
  pages = 	 {800--808},
  year = 	 {2014},
  editor = 	 {Nevin Zhang and Jin Tian},
  volume = 	 {30},
  month = 	 {7},
  publisher = 	 {AUAI Press},
  url =  	 {http://inverseprobability.com/publications/tosi-metrics14.html},
  abstract = 	 {We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.},
  crossref =  {Zhang:uai14},
  key = 	 {Tosi:metrics14},
  linkpdf = 	 {http://auai.org/uai2014/proceedings/individuals/171.pdf},

}
%T Metrics for Probabilistic Geometries
%A Alessandra Tosi and Søren Hauberg and Alfredo Vellido and Neil D. Lawrence
%B Uncertainty in Artificial Intelligence
%D 2014
%E Nevin Zhang and Jin Tian
%F tosi-metrics14
%I AUAI Press
%P 800--808
%U http://inverseprobability.com/publications/tosi-metrics14.html
%V 30
%X We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.
TY  - CPAPER
TI  - Metrics for Probabilistic Geometries
AU  - Alessandra Tosi
AU  - Søren Hauberg
AU  - Alfredo Vellido
AU  - Neil D. Lawrence
BT  - Uncertainty in Artificial Intelligence
PY  - 2014/07/23
DA  - 2014/07/23
ED  - Nevin Zhang
ED  - Jin Tian	
ID  - tosi-metrics14
PB  - AUAI Press	
SP  - 800
EP  - 808
L1  - http://auai.org/uai2014/proceedings/individuals/171.pdf
UR  - http://inverseprobability.com/publications/tosi-metrics14.html
AB  - We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.
ER  -

Tosi, A., Hauberg, S., Vellido, A. & Lawrence, N.D. (2014). Metrics for Probabilistic Geometries. Uncertainty in Artificial Intelligence 30:800-808