Tilted Variational Bayes

James Hensman, Lancaster University
Max Zwiessele, University of Sheffield
Neil D. Lawrence, University of Sheffield

In Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics 33, pp. 356–364

Related Material

Paper (PDF): http://jmlr.org/proceedings/papers/v33/hensman14.pdf
Software: https://github.com/SheffieldML/GPy
Experiment code: http://github.com/SheffieldML/TVB

Abstract

We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP.

Code to reproduce all the experiments can be found at http://github.com/SheffieldML/TVB.
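The bound referred to above extends the standard variational Bayes construction. For orientation (the notation below is our gloss on the standard VB setup, not lifted from the paper), the classical VB lower bound on the log marginal likelihood for latent function values f is

\log p(\mathbf{y}) = \log \int p(\mathbf{y}\,|\,\mathbf{f})\, p(\mathbf{f})\, \mathrm{d}\mathbf{f}
    \;\geq\; \mathbb{E}_{q(\mathbf{f})}\!\left[\log p(\mathbf{y}\,|\,\mathbf{f})\right]
    - \mathrm{KL}\!\left[q(\mathbf{f})\,\|\,p(\mathbf{f})\right].

With a light-tailed likelihood such as a hard step-function classification model, \log p(\mathbf{y}\,|\,\mathbf{f}) is -\infty wherever a label is violated, so the expectation under a Gaussian q(\mathbf{f}) diverges; this is the infinite-divergence failure of traditional VB that the abstract mentions.

The experiments build on GPy. As a purely illustrative sketch of the GP classification workflow they cover, the snippet below uses GPy's stock GPClassification model (whose default inference is EP, not the tilted bound itself); the repository above implements the paper's method on top of GPy.

# Hypothetical illustration: GP classification with GPy's built-in model.
# This is NOT the paper's TVB inference -- see github.com/SheffieldML/TVB.
import numpy as np
import GPy

rng = np.random.RandomState(0)
X = rng.rand(100, 1)                # 1-D inputs in [0, 1]
Y = (X > 0.5).astype(float)         # binary labels in {0, 1}, shape (100, 1)

kernel = GPy.kern.RBF(input_dim=1)  # squared-exponential covariance
model = GPy.models.GPClassification(X, Y, kernel=kernel)
model.optimize()                    # fit kernel hyperparameters

p, _ = model.predict(X)             # predictive probability of class 1
print(model)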


@InProceedings{hensman-tvb14,
  title = 	 {Tilted Variational Bayes},
  author = 	 {James Hensman and Max Zwiessele and Neil D. Lawrence},
  booktitle = 	 {Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics},
  pages = 	 {356--364},
  year = 	 {2014},
  editor = 	 {Sami Kaski and Jukka Corander},
  volume = 	 {33},
  address = 	 {Reykjavik, Iceland},
  publisher = 	 {JMLR W\&CP 33},
  url =  	 {http://inverseprobability.com/publications/hensman-tvb14.html},
  abstract = 	 {We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at http://github.com/SheffieldML/TVB.},
  crossref =  {Kaski:aistats14},
  key = 	 {Hensman:tvb14},
  linkpdf = 	 {http://jmlr.org/proceedings/papers/v33/hensman14.pdf},
  linksoftware = {https://github.com/SheffieldML/GPy}
}
%T Tilted Variational Bayes
%A James Hensman and Max Zwiessele and Neil D. Lawrence
%B Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics
%C Reykjavik, Iceland
%D 2014
%E Sami Kaski and Jukka Corander
%F hensman-tvb14
%I JMLR W&CP 33
%P 356--364
%U http://inverseprobability.com/publications/hensman-tvb14.html
%V 33
%X We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at http://github.com/SheffieldML/TVB.
TY  - CPAPER
TI  - Tilted Variational Bayes
AU  - James Hensman
AU  - Max Zwiessele
AU  - Neil D. Lawrence
BT  - Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics
PY  - 2014/01/01
DA  - 2014/01/01
ED  - Sami Kaski
ED  - Jukka Corander	
ID  - hensman-tvb14
PB  - JMLR W&CP 33
SP  - 356
EP  - 364
L1  - http://jmlr.org/proceedings/papers/v33/hensman14.pdf
UR  - http://inverseprobability.com/publications/hensman-tvb14.html
AB  - We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at http://github.com/SheffieldML/TVB.
ER  -

Hensman, J., Zwiessele, M. & Lawrence, N.D. (2014). Tilted Variational Bayes. In Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics 33:356–364