Exact inference in Bayesian neural networks is analytically intractable and, as a result, approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach with a particular emphasis on the variational procedure. We then present a new approximating distribution based on mixtures of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.
@TechReport{lawrence-nnmixtures99,
title = {A Variational Bayesian Committee of Neural Networks},
author = {Neil D. Lawrence and Mehdi Azzouzi},
year = {1999},
url = {http://inverseprobability.com/publications/lawrence-nnmixtures99.html},
abstract = {Exact inference in Bayesian neural networks is analytically intractable and, as a result, approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach with a particular emphasis on the variational procedure. We then present a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.},
key = {Lawrence:nnmixtures99},
linkpsgz = {http://www.thelawrences.net/neil/nnmixture.ps.gz},
}
%T A Variational Bayesian Committee of Neural Networks
%A Neil D. Lawrence and Mehdi Azzouzi
%D 1999
%F lawrence-nnmixtures99
%U http://inverseprobability.com/publications/lawrence-nnmixtures99.html
%X Exact inference in Bayesian neural networks is analytically intractable and, as a result, approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach with a particular emphasis on the variational procedure. We then present a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.
TY - CPAPER
TI - A Variational Bayesian Committee of Neural Networks
AU - Neil D. Lawrence
AU - Mehdi Azzouzi
PY - 1999/01/01
DA - 1999/01/01
ID - lawrence-nnmixtures99
UR - http://inverseprobability.com/publications/lawrence-nnmixtures99.html
AB - Exact inference in Bayesian neural networks is analytically intractable and, as a result, approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we present a general overview of the Bayesian approach with a particular emphasis on the variational procedure. We then present a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented. We present results on a simple toy problem and on two real-world data sets.
ER -
Lawrence, N.D. & Azzouzi, M. (1999). A Variational Bayesian Committee of Neural Networks.