Fast variational inference in the Conjugate Exponential family

James Hensman, Magnus Rattray, Neil D. Lawrence
Advances in Neural Information Processing Systems, 25, 2012.

Abstract

We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference. Our collapsed variational inference leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model where the mean field update equations have been derived. Empirically we show significant speed-ups for probabilistic models optimized using our bound.
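The key computational idea is that, for conjugate-exponential models, the natural gradient of the variational bound with respect to a factor's natural parameters has a closed form, and a unit step along it recovers the standard mean-field update; the paper then applies conjugate-gradient search in this geometry to speed things up. Below is a minimal sketch of natural-gradient ascent on the bound for a toy conjugate model (this is not the authors' code; the toy model, step size, and all variable names are illustrative assumptions).

```python
import numpy as np

# Toy conjugate-exponential model: unknown Gaussian mean mu with known
# noise variance 1 and a Gaussian prior N(0, 1). The variational posterior
# q(mu) is Gaussian, parameterised by its natural parameters
# eta = (m / s2, -1 / (2 s2)).

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)             # simulated observations

prior_eta = np.array([0.0, -0.5])             # natural parameters of N(0, 1)
lik_eta = np.array([x.sum(), -0.5 * len(x)])  # likelihood sufficient statistics

# For conjugate-exponential models the natural gradient of the bound with
# respect to eta is (fixed-point target) - (current eta): a step of length 1
# reproduces one mean-field VB update, and conjugate directions in this
# geometry are what give the acceleration reported in the paper.
eta = prior_eta.copy()                        # initialise q(mu) at the prior
for _ in range(20):
    nat_grad = (prior_eta + lik_eta) - eta    # natural gradient of the bound
    eta = eta + 0.5 * nat_grad                # partial step along the gradient

post_var = -0.5 / eta[1]
post_mean = eta[0] * post_var
print(f"q(mu) = N({post_mean:.3f}, {post_var:.4f})")
# In this conjugate toy model the fixed point matches the exact posterior:
# precision 1 + 50 = 51, mean sum(x) / 51.
```

With a step size of 1 the loop above collapses to a single exact mean-field update; the partial step is shown only to make the gradient-ascent interpretation visible.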

Cite this Paper


BibTeX
@InProceedings{Hensman:fast12,
  title = {Fast variational inference in the Conjugate Exponential family},
  author = {Hensman, James and Rattray, Magnus and Lawrence, Neil D.},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2012},
  editor = {Bartlett, Peter L. and Pereira, Fernando C. N. and Burges, Christopher J. C. and Bottou, Léon and Weinberger, Kilian Q.},
  volume = {25},
  address = {Cambridge, MA},
  pdf = {https://proceedings.neurips.cc/paper/2012/file/50905d7b2216bfeccb5b41016357176b-Paper.pdf},
  url = {http://inverseprobability.com/publications/hensman-fast12.html},
  abstract = {We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference. Our collapsed variational inference leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model where the mean field update equations have been derived. Empirically we show significant speed-ups for probabilistic models optimized using our bound.}
}
Endnote
%0 Conference Paper
%T Fast variational inference in the Conjugate Exponential family
%A James Hensman
%A Magnus Rattray
%A Neil D. Lawrence
%B Advances in Neural Information Processing Systems
%D 2012
%E Peter L. Bartlett
%E Fernando C. N. Pereira
%E Christopher J. C. Burges
%E Léon Bottou
%E Kilian Q. Weinberger
%F Hensman:fast12
%U http://inverseprobability.com/publications/hensman-fast12.html
%V 25
%X We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference. Our collapsed variational inference leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model where the mean field update equations have been derived. Empirically we show significant speed-ups for probabilistic models optimized using our bound.
RIS
TY - CPAPER
TI - Fast variational inference in the Conjugate Exponential family
AU - James Hensman
AU - Magnus Rattray
AU - Neil D. Lawrence
BT - Advances in Neural Information Processing Systems
DA - 2012/12/04
ED - Peter L. Bartlett
ED - Fernando C. N. Pereira
ED - Christopher J. C. Burges
ED - Léon Bottou
ED - Kilian Q. Weinberger
ID - Hensman:fast12
VL - 25
L1 - https://proceedings.neurips.cc/paper/2012/file/50905d7b2216bfeccb5b41016357176b-Paper.pdf
UR - http://inverseprobability.com/publications/hensman-fast12.html
AB - We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference. Our collapsed variational inference leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model where the mean field update equations have been derived. Empirically we show significant speed-ups for probabilistic models optimized using our bound.
ER -
APA
Hensman, J., Rattray, M. & Lawrence, N.D.. (2012). Fast variational inference in the Conjugate Exponential family. Advances in Neural Information Processing Systems 25 Available from http://inverseprobability.com/publications/hensman-fast12.html.