Tilted Variational Bayes

James Hensman, Max Zwiessele, Neil D. Lawrence
Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics, PMLR 33:356-364, 2014.

Abstract

We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
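For orientation only: the paper's TVB objective replaces the expected log-likelihood terms of standard VB with constructs built from EP's tilted distributions, but the general shape of "a lower bound on the marginal likelihood" is easiest to see in the standard VB bound for GP probit classification that TVB improves upon. The sketch below is not the TVB bound from the paper; it is a minimal, illustrative evaluation of the standard bound for a Gaussian q(f) = N(mu, S) against a zero-mean GP prior N(0, K), with all function names chosen for illustration.

import numpy as np
from scipy.stats import norm

def gauss_kl(mu, S, K):
    # KL( N(mu, S) || N(0, K) ) between two full-covariance Gaussians.
    n = len(mu)
    trace_term = np.trace(np.linalg.solve(K, S))
    quad_term = mu @ np.linalg.solve(K, mu)
    logdet_term = np.linalg.slogdet(K)[1] - np.linalg.slogdet(S)[1]
    return 0.5 * (trace_term + quad_term - n + logdet_term)

def vb_lower_bound(y, mu, S, K, n_quad=20):
    # Standard VB bound (NOT the paper's tilted bound):
    #   log p(y) >= sum_i E_q[log Phi(y_i f_i)] - KL(q || prior),
    # with each one-dimensional expectation under q(f_i) = N(mu_i, S_ii)
    # computed by Gauss-Hermite quadrature.
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)  # probabilists' Hermite nodes
    w = w / np.sqrt(2.0 * np.pi)                       # rescale weights for E under N(0,1)
    sd = np.sqrt(np.diag(S))
    f = mu[:, None] + sd[:, None] * x[None, :]         # quadrature nodes per data point
    exp_loglik = np.sum(w * norm.logcdf(y[:, None] * f))
    return exp_loglik - gauss_kl(mu, S, K)

# Toy check: with q equal to the prior, the KL term vanishes and the bound
# reduces to the summed expected log-likelihoods.
n = 5
y = np.sign(np.random.randn(n))
K = np.eye(n)
print(vb_lower_bound(y, np.zeros(n), K.copy(), K))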

Cite this Paper


BibTeX
@InProceedings{Hensman-tvb14,
  title = {Tilted Variational {B}ayes},
  author = {Hensman, James and Zwiessele, Max and Lawrence, Neil D.},
  booktitle = {Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics},
  pages = {356--364},
  year = {2014},
  editor = {Kaski, Sami and Corander, Jukka},
  volume = {33},
  series = {PMLR},
  address = {Reykjavik, Iceland},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v33/hensman14.pdf},
  url = {http://inverseprobability.com/publications/tilted-variational-bayes.html},
  abstract = {We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .}
}
Endnote
%0 Conference Paper
%T Tilted Variational Bayes
%A James Hensman
%A Max Zwiessele
%A Neil D. Lawrence
%B Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics
%C PMLR
%D 2014
%E Sami Kaski
%E Jukka Corander
%F Hensman-tvb14
%I PMLR
%P 356--364
%U http://inverseprobability.com/publications/tilted-variational-bayes.html
%V 33
%X We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
RIS
TY - CPAPER
TI - Tilted Variational Bayes
AU - James Hensman
AU - Max Zwiessele
AU - Neil D. Lawrence
BT - Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics
DA - 2014/04/02
ED - Sami Kaski
ED - Jukka Corander
ID - Hensman-tvb14
PB - PMLR
DP - PMLR
VL - 33
SP - 356
EP - 364
L1 - http://proceedings.mlr.press/v33/hensman14.pdf
UR - http://inverseprobability.com/publications/tilted-variational-bayes.html
AB - We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
ER -
APA
Hensman, J., Zwiessele, M. & Lawrence, N.D. (2014). Tilted Variational Bayes. Proceedings of the Seventeenth International Workshop on Artificial Intelligence and Statistics, in PMLR 33:356-364. Available from http://inverseprobability.com/publications/tilted-variational-bayes.html.
