Tilted Variational Bayes

James Hensman, Max Zwiessele, Neil D. Lawrence
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR 33:356-364, 2014.

Abstract

We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound on the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
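As background for the abstract's claim about traditional VB, the following is a minimal sketch (compilable LaTeX) of the standard VB evidence lower bound that the paper contrasts against; it is not the tilted bound derived in the paper, and the step-function likelihood is an assumed illustrative example of the light-tailed case.

% Background sketch only: the standard VB lower bound and why a
% light-tailed (step-function) classification likelihood breaks it.
% The step-function likelihood is an illustrative assumption here,
% not a construction taken from the paper.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Traditional VB fits a Gaussian $q(\mathbf{f})$ by maximising
\begin{equation*}
\log p(\mathbf{y}) \;\ge\;
\mathbb{E}_{q(\mathbf{f})}\bigl[\log p(\mathbf{y}\mid\mathbf{f})\bigr]
- \mathrm{KL}\bigl[q(\mathbf{f})\,\|\,p(\mathbf{f})\bigr],
\end{equation*}
which is equivalent to minimising
$\mathrm{KL}\bigl[q(\mathbf{f})\,\|\,p(\mathbf{f}\mid\mathbf{y})\bigr]$.
For a hard classification likelihood
$p(y_i \mid f_i) = \mathbb{I}\,[y_i f_i > 0]$ (a step function), any
Gaussian $q$ places mass on the region $\{y_i f_i < 0\}$ where the
likelihood is zero, so
$\mathbb{E}_{q}[\log p(y_i \mid f_i)] = -\infty$ and the minimised KL
divergence is infinite: this is the failure mode the abstract refers
to for traditional VB in Gaussian process classification.
\end{document}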

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-hensman-tvb14,
  title     = {Tilted Variational Bayes},
  author    = {James Hensman and Max Zwiessele and Neil D. Lawrence},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {356--364},
  year      = {2014},
  editor    = {Samuel Kaski and Jukka Corander},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  publisher = {PMLR},
  url       = {http://inverseprobability.com/publications/hensman-tvb14.html},
  abstract  = {We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound on the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .}
}
Endnote
%0 Conference Paper
%T Tilted Variational Bayes
%A James Hensman
%A Max Zwiessele
%A Neil D. Lawrence
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-hensman-tvb14
%I PMLR
%J Proceedings of Machine Learning Research
%P 356--364
%U http://inverseprobability.com/publications/hensman-tvb14.html
%V 33
%W PMLR
%X We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound on the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
RIS
TY - CPAPER
TI - Tilted Variational Bayes
AU - James Hensman
AU - Max Zwiessele
AU - Neil D. Lawrence
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
PY - 2014
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-hensman-tvb14
PB - PMLR
DP - PMLR
SP - 356
EP - 364
UR - http://inverseprobability.com/publications/hensman-tvb14.html
AB - We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound on the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at .
ER -
APA
Hensman, J., Zwiessele, M. & Lawrence, N.D. (2014). Tilted Variational Bayes. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in PMLR 33:356-364.
