Variational Learning for Multi-layer networks of Linear Threshold Units

Neil D. Lawrence, University of Sheffield

In Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, pp. 245–252

Abstract

Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulty of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.
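For readers unfamiliar with the model class: a linear threshold unit (LTU) outputs 1 exactly when a weighted sum of its inputs exceeds a threshold. The sketch below is purely illustrative and is not the paper's variational algorithm; it trains a single LTU with the classic perceptron rule on the linearly separable AND function, the setting where such training is known to succeed (the paper addresses the harder multi-layer case).

```python
# Illustrative sketch only: a single linear threshold unit trained with the
# classic perceptron rule. The paper's contribution is a variational algorithm
# for multi-layer LTU networks, which this toy example does not implement.

def ltu(w, b, x):
    """Fire (output 1) iff the weighted sum w.x + b exceeds zero."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def perceptron_train(X, y, epochs=20):
    """Perceptron update: nudge weights by the prediction error on each example."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - ltu(w, b, xi)
            w = [wj + err * xj for wj, xj in zip(w, xi)]
            b += err
    return w, b

# Logical AND is linearly separable, so a single LTU can represent and learn it.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = perceptron_train(X, y)
preds = [ltu(w, b, xi) for xi in X]  # [0, 0, 0, 1]
```

With hidden LTUs the thresholds are non-differentiable and the perceptron rule no longer applies directly, which is the difficulty the paper's graphical-model formulation is designed to overcome.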


@InProceedings{lawrence-ltu01,
  title = 	 {Variational Learning for Multi-layer networks of Linear Threshold Units},
  author = 	 {Neil D. Lawrence},
  booktitle = 	 {Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics},
  pages = 	 {245--252},
  year = 	 {2001},
  editor = 	 {Tommi S. Jaakkola and Thomas S. Richardson},
  address = 	 {San Francisco, CA},
  publisher = 	 {Morgan Kaufmann},
  url =  	 {http://inverseprobability.com/publications/lawrence-ltu01.html},
  abstract = 	 {Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulty of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.},
  crossref =  {Jaakkola:aistats01},
  key = 	 {Lawrence:ltu01},
  linkpdf = 	 {http://www.thelawrences.net/neil/ltus.pdf},
  linkpsgz =  {http://www.thelawrences.net/neil/ltus.ps.gz}
}
%T Variational Learning for Multi-layer networks of Linear Threshold Units
%A Neil D. Lawrence
%B Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics
%C San Francisco, CA
%D 2001
%E Tommi S. Jaakkola and Thomas S. Richardson
%F lawrence-ltu01
%I Morgan Kaufmann
%P 245--252
%U http://inverseprobability.com/publications/lawrence-ltu01.html
%X Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulty of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.
TY  - CPAPER
TI  - Variational Learning for Multi-layer networks of Linear Threshold Units
AU  - Neil D. Lawrence
BT  - Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics
PY  - 2001/01/01
DA  - 2001/01/01
ED  - Tommi S. Jaakkola
ED  - Thomas S. Richardson
ID  - lawrence-ltu01
PB  - Morgan Kaufmann
SP  - 245
EP  - 252
L1  - http://www.thelawrences.net/neil/ltus.pdf
UR  - http://inverseprobability.com/publications/lawrence-ltu01.html
AB  - Linear threshold units were originally proposed as models of biological neurons. They were widely studied in the context of the perceptron (Rosenblatt, 1962). Due to the difficulty of finding a general algorithm for networks with hidden nodes, they never passed into general use. We derive an algorithm in the context of graphical models and show how it may be applied in multi-layer networks of linear threshold units. We demonstrate the algorithm on three well-known datasets.
ER  -

Lawrence, N. D. (2001). Variational Learning for Multi-layer networks of Linear Threshold Units. In Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, pp. 245–252.