Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on mixtures of Gaussian distributions and show how it may be implemented.
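The approximating family the abstract refers to, a mixture of Gaussians over the network weights, can be sketched in a few lines. This is an illustrative sketch, not code from the paper: the isotropic components and the log-sum-exp evaluation are simplifying assumptions made here for brevity.

```python
import numpy as np

def log_gaussian(w, mean, var):
    """Log density of an isotropic Gaussian N(w | mean, var * I)."""
    d = w.size
    return -0.5 * (d * np.log(2 * np.pi * var)
                   + np.sum((w - mean) ** 2) / var)

def log_mixture(w, pis, means, variances):
    """Log density of the approximating mixture
    q(w) = sum_m pi_m N(w | mu_m, var_m * I),
    evaluated with the log-sum-exp trick for numerical stability."""
    logs = np.array([np.log(pi) + log_gaussian(w, mu, var)
                     for pi, mu, var in zip(pis, means, variances)])
    m = logs.max()
    return m + np.log(np.sum(np.exp(logs - m)))

# Example: a two-component mixture over a 3-dimensional weight vector.
pis = [0.6, 0.4]
means = [np.zeros(3), np.ones(3)]
variances = [1.0, 0.5]
print(log_mixture(np.zeros(3), pis, means, variances))
```

In a variational scheme the mixture weights, means and variances would be fitted by maximising a lower bound on the marginal likelihood; the multimodality of the mixture is what lets it capture posterior structure that a single Gaussian cannot.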

@TechReport{lawrence-structure01,
title = {The Structure of Neural Network Posteriors},
author = {Neil D. Lawrence and Mehdi Azzouzi},
year = {2001},
month = {1},
edit = {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2001-01-01-lawrence-structure01.md},
url = {http://inverseprobability.com/publications/lawrence-structure01.html},
abstract = {Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.},
key = {Lawrence:structure01},
note = {},
linkpdf = {http://www.thelawrences.net/neil/mixture.pdf},
linkpsgz = {http://www.thelawrences.net/neil/mixture.ps.gz},
OPTgroup = {}
}

%T The Structure of Neural Network Posteriors
%A Neil D. Lawrence and Mehdi Azzouzi
%D 2001
%F lawrence-structure01
%U http://inverseprobability.com/publications/lawrence-structure01.html
%X Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.

TY - RPRT
TI - The Structure of Neural Network Posteriors
AU - Neil D. Lawrence
AU - Mehdi Azzouzi
PY - 2001/01/01
DA - 2001/01/01
ID - lawrence-structure01
L1 - http://www.thelawrences.net/neil/mixture.pdf
UR - http://inverseprobability.com/publications/lawrence-structure01.html
AB - Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.
ER -

Lawrence, N.D. & Azzouzi, M. (2001). The Structure of Neural Network Posteriors. Technical report.
