# Variational Learning for Multi-layer networks of Linear Threshold Units

Neil D. Lawrence, 2000.

#### Abstract

Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulty of finding a general learning algorithm for networks with hidden nodes, they never passed into general use. We derive such an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.
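For readers unfamiliar with the model, a linear threshold unit computes a weighted sum of its inputs and outputs 1 when that sum exceeds a threshold, and 0 otherwise. A minimal sketch in Python (the weights and the AND example are illustrative, not taken from the paper):

```python
import numpy as np

def linear_threshold_unit(x, w, b):
    """Output 1 if the weighted sum w.x + b is positive, else 0."""
    return int(np.dot(w, x) + b > 0)

# Illustrative example: weights [1, 1] with bias -1.5 implement logical AND,
# since the weighted sum exceeds zero only when both inputs are 1.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
outputs = [linear_threshold_unit(np.array(x), np.array([1.0, 1.0]), -1.5)
           for x in inputs]
```

The hard threshold is what makes gradient-based training of hidden units difficult: the output is piecewise constant in the weights, which is the obstacle the paper's variational treatment addresses.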

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-lawrence-ltu_report00,
title = {Variational Learning for Multi-layer networks of Linear Threshold Units},
author = {Neil D. Lawrence},
year = {2000},
editor = {},
url = {http://inverseprobability.com/publications/lawrence-ltu_report00.html},
abstract = {Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.}
}
```

Endnote

```
%0 Conference Paper
%T Variational Learning for Multi-layer networks of Linear Threshold Units
%A Neil D. Lawrence
%B
%C Proceedings of Machine Learning Research
%D 2000
%E
%F pmlr-v-lawrence-ltu_report00
%I PMLR
%J Proceedings of Machine Learning Research
%P --
%U http://inverseprobability.com
%V
%W PMLR
%X Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.
```

RIS

```
TY - CPAPER
TI - Variational Learning for Multi-layer networks of Linear Threshold Units
AU - Neil D. Lawrence
BT -
PY - 2000
DA -
ED -
ID - pmlr-v-lawrence-ltu_report00
PB - PMLR
SP -
DP - PMLR
EP -
L1 -
UR - http://inverseprobability.com/publications/lawrence-ltu_report00.html
AB - Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.
ER -
```

APA

Lawrence, N.D. (2000). Variational Learning for Multi-layer networks of Linear Threshold Units. Available from http://inverseprobability.com/publications/lawrence-ltu_report00.html
