A Variational Bayesian Committee of Neural Networks

Neil D. Lawrence and Mehdi Azzouzi, 1999.

Abstract

Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach, with a particular emphasis on the variational procedure. We then present a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.
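For context, the following is a minimal sketch of the kind of variational objective the abstract refers to; the notation ($\mathbf{w}$ for the network weights, $D$ for the data, $M$ mixture components) is assumed for illustration and is not taken from the paper itself.

$$
\ln p(D) \;\geq\; \mathcal{L}(q) \;=\; \int q(\mathbf{w}) \,\ln \frac{p(D \mid \mathbf{w})\, p(\mathbf{w})}{q(\mathbf{w})}\, \mathrm{d}\mathbf{w},
\qquad
q(\mathbf{w}) \;=\; \sum_{m=1}^{M} \pi_m\, \mathcal{N}\!\left(\mathbf{w} \mid \boldsymbol{\mu}_m, \boldsymbol{\Sigma}_m\right).
$$

Maximising $\mathcal{L}(q)$ with respect to the mixing coefficients $\pi_m$ and the component parameters tightens the lower bound on the marginal likelihood; with $M = 1$ this reduces to the standard single-Gaussian variational approximation. Note that the entropy of a Gaussian mixture is not available in closed form, so in practice a further bound or approximation of that term is required.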

Cite this Paper


BibTeX
@InProceedings{pmlr-v-lawrence-nnmixtures99,
  title = {A Variational Bayesian Committee of Neural Networks},
  author = {Neil D. Lawrence and Mehdi Azzouzi},
  year = {1999},
  url = {http://inverseprobability.com/publications/lawrence-nnmixtures99.html},
  abstract = {Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach, with a particular emphasis on the variational procedure. We then present a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.}
}
APA
Lawrence, N.D. & Azzouzi, M. (1999). A Variational Bayesian Committee of Neural Networks.
