Approximating Posterior Distributions in Belief Networks using Mixtures

Christopher M. Bishop, Neil D. Lawrence, Tommi S. Jaakkola and Michael I. Jordan
Advances in Neural Information Processing Systems 10, pp. 416-422, MIT Press, 1998.

Abstract

Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on *mixtures* of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
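The limitation the abstract points to can be made concrete with a toy example (a minimal sketch, not code from the paper): a factorial (mean field) distribution over two binary variables factorizes as Q(h1, h2) = q1(h1) q2(h2), so it cannot place all its mass on the two "agreeing" states (0,0) and (1,1) at once, whereas a two-component mixture of factorial distributions can. The function names and the target distribution below are illustrative choices, not the paper's notation.

```python
def factorial_q(q1, q2):
    """Joint table of a factorial (independent) distribution over two binary variables."""
    return {(h1, h2): (q1 if h1 else 1 - q1) * (q2 if h2 else 1 - q2)
            for h1 in (0, 1) for h2 in (0, 1)}

def mixture_q(alphas, components):
    """Mixture of factorial components: Q(H) = sum_m alpha_m * Q_m(H)."""
    joint = {(h1, h2): 0.0 for h1 in (0, 1) for h2 in (0, 1)}
    for a, comp in zip(alphas, components):
        for h, p in comp.items():
            joint[h] += a * p
    return joint

# A bimodal target posterior: mass only on (0,0) and (1,1).
target = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

# The closest single factorial distribution (q1 = q2 = 0.5) spreads
# probability 0.25 onto every state, including the "wrong" ones.
single = factorial_q(0.5, 0.5)
print(single[(0, 1)])  # 0.25

# A mixture of two near-deterministic factorial components is multimodal
# and matches the target exactly.
mix = mixture_q([0.5, 0.5], [factorial_q(0.0, 0.0), factorial_q(1.0, 1.0)])
print(mix == target)  # True
```

This is the sense in which a mixture of mean field distributions is a strictly richer approximating family: each component remains tractable and factorial, while the mixture as a whole can capture multiple posterior modes.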

Cite this Paper


BibTeX
@InProceedings{Bishop:mixtures97,
  title     = {Approximating Posterior Distributions in Belief Networks using Mixtures},
  author    = {Bishop, Christopher M. and Lawrence, Neil D. and Jaakkola, Tommi S. and Jordan, Michael I.},
  booktitle = {Advances in Neural Information Processing Systems},
  pages     = {416--422},
  year      = {1998},
  editor    = {Jordan, Michael I. and Kearns, Michael J. and Solla, Sara A.},
  volume    = {10},
  address   = {Cambridge, MA},
  publisher = {MIT Press},
  pdf       = {https://inverseprobability.com/publications/files/mixtures.pdf},
  url       = {http://inverseprobability.com/publications/bishop-mixtures97.html},
  abstract  = {Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on *mixtures* of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.}
}
Endnote
%0 Conference Paper
%T Approximating Posterior Distributions in Belief Networks using Mixtures
%A Christopher M. Bishop
%A Neil D. Lawrence
%A Tommi S. Jaakkola
%A Michael I. Jordan
%B Advances in Neural Information Processing Systems
%D 1998
%E Michael I. Jordan
%E Michael J. Kearns
%E Sara A. Solla
%F Bishop:mixtures97
%I MIT Press
%P 416--422
%U http://inverseprobability.com/publications/bishop-mixtures97.html
%V 10
%X Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on *mixtures* of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
RIS
TY  - CPAPER
TI  - Approximating Posterior Distributions in Belief Networks using Mixtures
AU  - Christopher M. Bishop
AU  - Neil D. Lawrence
AU  - Tommi S. Jaakkola
AU  - Michael I. Jordan
BT  - Advances in Neural Information Processing Systems
DA  - 1998/01/01
ED  - Michael I. Jordan
ED  - Michael J. Kearns
ED  - Sara A. Solla
ID  - Bishop:mixtures97
PB  - MIT Press
VL  - 10
SP  - 416
EP  - 422
L1  - https://inverseprobability.com/publications/files/mixtures.pdf
UR  - http://inverseprobability.com/publications/bishop-mixtures97.html
AB  - Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on *mixtures* of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
ER  -
APA
Bishop, C.M., Lawrence, N.D., Jaakkola, T.S. & Jordan, M.I. (1998). Approximating Posterior Distributions in Belief Networks using Mixtures. Advances in Neural Information Processing Systems 10:416-422. Available from http://inverseprobability.com/publications/bishop-mixtures97.html.
