Variational Learning for Multi-layer networks of Linear Threshold Units

Neil D. Lawrence
245-252, 2001.

Abstract

Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well-known datasets.
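For readers unfamiliar with the model class: a linear threshold unit outputs 1 when a weighted sum of its inputs plus a bias is positive, and 0 otherwise. The sketch below is only an illustration of such units arranged in a feed-forward network; it is not the paper's variational learning algorithm, and the network shape, names, and random weights are hypothetical.

import numpy as np

def linear_threshold_unit(x, w, b):
    # Fire (output 1) if the weighted sum of inputs plus bias is positive.
    return 1 if np.dot(w, x) + b > 0 else 0

def forward(x, layers):
    # Propagate an input vector through a list of (weights, biases) layers.
    for W, b in layers:
        x = np.array([linear_threshold_unit(x, W[j], b[j]) for j in range(W.shape[0])])
    return x

# Hypothetical 2-4-1 network with random weights, for illustration only.
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 2)), rng.standard_normal(4)),
          (rng.standard_normal((1, 4)), rng.standard_normal(1))]
print(forward(np.array([1, 0]), layers))

Because the hidden units' outputs are hard thresholds rather than differentiable activations, gradient-based training does not apply directly; this is the difficulty with hidden nodes that the abstract refers to and that the paper addresses with a variational treatment.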

Cite this Paper


BibTeX
@InProceedings{pmlr-v-lawrence-ltu01,
  title = {Variational Learning for Multi-layer networks of Linear Threshold Units},
  author = {Neil D. Lawrence},
  pages = {245--252},
  year = {2001},
  editor = {},
  address = {San Francisco, CA},
  url = {http://inverseprobability.com/publications/lawrence-ltu01.html},
  abstract = {Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well-known datasets.}
}
Endnote
%0 Conference Paper
%T Variational Learning for Multi-layer networks of Linear Threshold Units
%A Neil D. Lawrence
%B
%C Proceedings of Machine Learning Research
%D 2001
%E
%F pmlr-v-lawrence-ltu01
%I PMLR
%J Proceedings of Machine Learning Research
%P 245--252
%U http://inverseprobability.com
%V
%W PMLR
%X Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well-known datasets.
RIS
TY - CPAPER
TI - Variational Learning for Multi-layer networks of Linear Threshold Units
AU - Neil D. Lawrence
BT -
PY - 2001
DA -
ED -
ID - pmlr-v-lawrence-ltu01
PB - PMLR
SP - 245
DP - PMLR
EP - 252
L1 -
UR - http://inverseprobability.com/publications/lawrence-ltu01.html
AB - Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well-known datasets.
ER -
APA
Lawrence, N.D. (2001). Variational Learning for Multi-layer networks of Linear Threshold Units. In PMLR, 245-252.
