Tilted Variational Bayes
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:356-364, 2014.
Abstract
We present a novel method for approximate inference. Using some of the
constructs from expectation propagation (EP), we derive a lower bound on the marginal
likelihood in a similar fashion to variational Bayes (VB). The method combines the
benefits of both approaches: like EP, it can be used with light-tailed likelihoods (where
traditional VB fails), and like VB, it provides a lower bound on the marginal likelihood.
We apply the method to Gaussian process classification, a situation where the Kullback-Leibler
divergence minimized in traditional VB can be infinite, and to robust Gaussian process
regression, where inference is dramatically simplified in comparison
to EP.
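
For orientation, the following display gives the standard VB bound that the abstract refers to, not the tilted bound derived in the paper itself; the notation (latent function values f, observations y, variational distribution q) is ours, chosen to match common GP usage:

\[
\log p(\mathbf{y})
= \log \int p(\mathbf{y}\mid\mathbf{f})\, p(\mathbf{f})\, \mathrm{d}\mathbf{f}
\;\ge\; \mathbb{E}_{q(\mathbf{f})}\!\big[\log p(\mathbf{y}\mid\mathbf{f})\big]
\;-\; \mathrm{KL}\!\big(q(\mathbf{f})\,\|\,p(\mathbf{f})\big).
\]

Maximizing this bound is equivalent to minimizing KL(q(f) || p(f | y)). This is the divergence that can become infinite in classification: with a sufficiently light-tailed likelihood (a hard step function being the extreme case), a Gaussian q can place mass where the posterior density vanishes, driving the expected log-likelihood term to negative infinity.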
Code to reproduce all the experiments can be found at .