
# The Structure of Neural Network Posteriors

Neil D. Lawrence, Mehdi Azzouzi, 2001.

#### Abstract

Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.
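The idea in the abstract — replacing a single Gaussian variational posterior with a *mixture* of Gaussians so that separate posterior modes can each be covered by their own component — can be sketched numerically. The toy below is an illustration under invented assumptions, not the paper's method or experiment: it compares Monte Carlo estimates of the variational lower bound (ELBO) for a single Gaussian versus a two-component mixture on a bimodal one-dimensional "posterior" over a single weight, of the kind that weight-space symmetries produce in neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(w):
    # Unnormalised log density of a toy bimodal posterior: equal-weight
    # mixture of N(-2, 0.5^2) and N(2, 0.5^2). Neural network posteriors
    # are typically multimodal, which motivates a mixture approximation.
    return np.logaddexp(-0.5 * ((w + 2.0) / 0.5) ** 2,
                        -0.5 * ((w - 2.0) / 0.5) ** 2)

def elbo_mc(sample_q, log_q, n=20000):
    # Monte Carlo estimate of the lower bound  E_q[log p(w) - log q(w)].
    w = sample_q(n)
    return np.mean(log_p(w) - log_q(w))

# Single-Gaussian approximation q1 = N(0, 2^2): one broad component
# forced to straddle both modes.
def sample_q1(n):
    return rng.normal(0.0, 2.0, size=n)

def log_q1(w):
    return -0.5 * (w / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

# Mixture approximation q2 = 0.5 N(-2, 0.5^2) + 0.5 N(2, 0.5^2):
# one component per mode.
def sample_q2(n):
    centres = rng.choice([-2.0, 2.0], size=n)
    return rng.normal(centres, 0.5)

def log_q2(w):
    def log_norm(w, m, s):
        return -0.5 * ((w - m) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))
    return np.logaddexp(log_norm(w, -2.0, 0.5),
                        log_norm(w, 2.0, 0.5)) + np.log(0.5)

print(elbo_mc(sample_q1, log_q1))  # looser bound from the single Gaussian
print(elbo_mc(sample_q2, log_q2))  # tighter bound from the mixture
```

Because the mixture matches the toy posterior's shape exactly, its bound is tight (equal to the log normalising constant), while the single Gaussian pays a penalty for spreading mass between the modes — the gap between the two estimates is the kind of structure a mixture approximating distribution is designed to recover.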

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-lawrence-structure01,
title = {The Structure of Neural Network Posteriors},
author = {Neil D. Lawrence and Mehdi Azzouzi},
year = {2001},
editor = {},
url = {http://inverseprobability.com/publications/lawrence-structure01.html},
abstract = {Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.}
}
```

Endnote

```
%0 Conference Paper
%T The Structure of Neural Network Posteriors
%A Neil D. Lawrence
%A Mehdi Azzouzi
%B
%C Proceedings of Machine Learning Research
%D 2001
%E
%F pmlr-v-lawrence-structure01
%I PMLR
%J Proceedings of Machine Learning Research
%P --
%U http://inverseprobability.com
%V
%W PMLR
%X Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.
```

RIS

```
TY - CPAPER
TI - The Structure of Neural Network Posteriors
AU - Neil D. Lawrence
AU - Mehdi Azzouzi
BT -
PY - 2001
DA -
ED -
ID - pmlr-v-lawrence-structure01
PB - PMLR
SP -
DP - PMLR
EP -
L1 -
UR - http://inverseprobability.com/publications/lawrence-structure01.html
AB - Exact inference in Bayesian neural networks is analytically intractable, and as a result approximate approaches such as the evidence procedure, Monte Carlo sampling and variational inference have been proposed. In this paper we explore the structure of the posterior distributions in such a model through a new approximating distribution based on *mixtures* of Gaussian distributions and show how it may be implemented.
ER -
```

APA

Lawrence, N.D. & Azzouzi, M. (2001). The Structure of Neural Network Posteriors.
