Variational Learning for Multi-layer networks of Linear Threshold Units

Neil D. Lawrence, 2000.

Abstract

Linear threshold units were originally proposed as models of biological neurons and were widely studied in the context of the perceptron (Rosenblatt, 1962). Because of the difficulty of finding a general learning algorithm for networks with hidden nodes, they never passed into general use. We derive such an algorithm in the context of graphical models and show how it may be applied to multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.
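For readers unfamiliar with the model class, the following is a minimal sketch of a single linear threshold unit: a unit that outputs 1 when a weighted sum of its inputs exceeds a threshold and 0 otherwise. The function and variable names are illustrative only; this is not the variational learning algorithm derived in the paper.

```python
import numpy as np

def linear_threshold_unit(x, w, b):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0.

    A sketch of the basic unit studied in the paper; it does not
    implement the variational learning algorithm itself.
    """
    return 1 if np.dot(w, x) + b > 0 else 0

# Example: a single unit computing logical AND of two binary inputs.
w = np.array([1.0, 1.0])
b = -1.5  # threshold of 1.5 expressed as a bias
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, linear_threshold_unit(np.array(x, dtype=float), w, b))
```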

Cite this Paper


BibTeX
@Misc{Lawrence:ltu_report00,
  title = {Variational Learning for Multi-layer networks of Linear Threshold Units},
  author = {Lawrence, Neil D.},
  year = {2000},
  pdf = {http://www.thelawrences.net/neil/ltupaper.pdf},
  url = {http://inverseprobability.com/publications/lawrence-ltu_report00.html},
  abstract = {Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.},
  note = {Draft report, slightly extended version of \cite{Lawrence:ltu01}.}
}
Endnote
%0 Generic
%T Variational Learning for Multi-layer networks of Linear Threshold Units
%A Neil D. Lawrence
%D 2000
%F Lawrence:ltu_report00
%U http://inverseprobability.com/publications/lawrence-ltu_report00.html
%X Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.
%Z Draft report, slightly extended version of \cite{Lawrence:ltu01}.
RIS
TY - GEN
TI - Variational Learning for Multi-layer networks of Linear Threshold Units
AU - Neil D. Lawrence
DA - 2000/02/05
ID - Lawrence:ltu_report00
L1 - http://www.thelawrences.net/neil/ltupaper.pdf
UR - http://inverseprobability.com/publications/lawrence-ltu_report00.html
AB - Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron @Rosenblatt:book62. Due to the difficulties of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm through three well known datasets.
N1 - Draft report, slightly extended version of \cite{Lawrence:ltu01}.
ER -
APA
Lawrence, N. D. (2000). Variational Learning for Multi-layer networks of Linear Threshold Units. Available from http://inverseprobability.com/publications/lawrence-ltu_report00.html. Draft report, slightly extended version of \cite{Lawrence:ltu01}.
