Variational Inference Guide

Neil D. Lawrence, University of Sheffield

Abstract

This report is a brief introduction to variational inference for Bayesian models from the perspective of the Expectation Maximisation (EM) algorithm (Dempster et al., 1977). We start with an overview of the EM algorithm from the perspective of variational inference and then show how approximate inference may be performed. We briefly discuss when variational inference may be used and finally mention the variational importance sampler as an alternative approach.
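
As context for the abstract (and not reproduced from the report itself), the standard identity behind the variational view of EM decomposes the log marginal likelihood using a free distribution $q(\mathbf{x})$ over latent variables $\mathbf{x}$, with observations $\mathbf{y}$ and parameters $\theta$; this notation is an assumption here, not necessarily the report's own:

$$
\log p(\mathbf{y}\,|\,\theta) = \int q(\mathbf{x}) \log \frac{p(\mathbf{y}, \mathbf{x}\,|\,\theta)}{q(\mathbf{x})}\,\mathrm{d}\mathbf{x} + \mathrm{KL}\!\left(q(\mathbf{x})\,\|\,p(\mathbf{x}\,|\,\mathbf{y}, \theta)\right)
$$

The first term is a lower bound on $\log p(\mathbf{y}\,|\,\theta)$ because the KL divergence is non-negative. Exact EM alternates between setting $q(\mathbf{x}) = p(\mathbf{x}\,|\,\mathbf{y},\theta)$ (E-step) and maximising the bound with respect to $\theta$ (M-step); variational approximations arise when $q$ is instead restricted to a tractable family.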


@TechReport{lawrence-variationalguide02,
  title       = {Variational Inference Guide},
  author      = {Neil D. Lawrence},
  year        = {2002},
  institution = {University of Sheffield},
  edit        = {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2002-01-01-lawrence-variationalguide02.md},
  url         = {http://inverseprobability.com/publications/lawrence-variationalguide02.html},
  abstract    = {This report is a brief introduction to variational inference for Bayesian models from the perspective of the Expectation Maximisation (EM) algorithm (Dempster et al., 1977). We start with an overview of the EM algorithm from the perspective of variational inference and then show how approximate inference may be performed. We briefly discuss when variational inference may be used and finally mention the variational importance sampler as an alternative approach.},
  key         = {lawrence:variationalguide02},
  linkpdf     = {ftp://ftp.dcs.shef.ac.uk/home/neil/variationalInference.pdf},
  linkpsgz    = {ftp://ftp.dcs.shef.ac.uk/home/neil/variationalInference.ps.gz},
  group       = {shefml}
}

Lawrence, N.D. (2002). Variational Inference Guide. Technical report, University of Sheffield.