Efficient Modelling of Latent Information in Supervised Learning using Gaussian Processes


Zhenwen Dai, Amazon Research Cambridge
Mauricio A. Álvarez, University of Sheffield
Neil D. Lawrence, Amazon Research Cambridge and University of Sheffield

in Advances in Neural Information Processing Systems 30

Abstract

Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new one with only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with only a few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.
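
As a rough illustration of the structure the abstract describes, a separable multi-output Gaussian process prior can couple a kernel over inputs with a kernel over latent embeddings of the conditions, so the joint covariance factorises as a Kronecker product; this factorisation is what makes sparse-GP-style inference tractable. The sketch below is a minimal NumPy example under that assumption; names such as `rbf`, `X`, and `H` are illustrative and not taken from the paper.

```python
# Minimal sketch (not the authors' code): a separable prior over
# (input, condition) pairs built from an input kernel and a kernel
# over latent condition embeddings.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

# Toy setup: N inputs observed under each of P conditions.
N, P, Q_latent = 50, 4, 2
X = np.random.randn(N, 1)           # shared inputs
H = np.random.randn(P, Q_latent)    # latent embedding per condition (learned in LVMOGP)

K_x = rbf(X, X)                      # N x N covariance over inputs
K_h = rbf(H, H)                      # P x P covariance over condition embeddings
K_full = np.kron(K_h, K_x)           # (N*P) x (N*P) prior covariance over all observations

# A new condition is handled by inferring its embedding h* while reusing K_x,
# which is why only a few data points are needed to generalise to it.
```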


@InProceedings{efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes,
  title = 	 {Efficient Modelling of Latent Information in Supervised Learning using Gaussian Processes},
  author = 	 {Zhenwen Dai and Mauricio A. Álvarez and Neil D. Lawrence},
  booktitle = 	 {Advances in Neural Information Processing Systems},
  year = 	 {2017},
  volume = 	 {30},
  address = 	 {Cambridge, MA},
  month = 	 {12},
  url =  	 {http://inverseprobability.com/publications/efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes.html},
  abstract = 	 {Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new one with only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with only a few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.},
  key = 	 {Dai:supervised17},
  OPTgroup = 	 {}

}

Dai, Z., Álvarez, M.A. & Lawrence, N.D. (2017). Efficient Modelling of Latent Information in Supervised Learning using Gaussian Processes. Advances in Neural Information Processing Systems 30.