Matching Kernels through Kullback-Leibler Divergence Minimisation

Neil D. Lawrence, University of Sheffield
Guido Sanguinetti, University of Edinburgh

Abstract

In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
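The objective studied here has a well-known closed form: for two zero-mean Gaussians with covariance (kernel) matrices $K_1$ and $K_2$, the KL divergence is $\mathrm{KL} = \tfrac{1}{2}\big(\operatorname{tr}(K_2^{-1}K_1) - d + \ln\det K_2 - \ln\det K_1\big)$. The sketch below implements this standard formula and numerically illustrates the block-diagonal case: minimising this KL direction over block-diagonal approximations is solved by copying the original matrix's diagonal blocks (exponential-family moment matching). This is an illustrative sketch of the general objective, not a reproduction of the paper's eigenvector-based derivation; all function and variable names are my own.

```python
import numpy as np

def kl_zero_mean_gaussians(K1, K2):
    """KL( N(0, K1) || N(0, K2) ) for positive-definite matrices K1, K2.

    Closed form: 0.5 * ( tr(K2^{-1} K1) - d + ln det K2 - ln det K1 ).
    """
    d = K1.shape[0]
    K2_inv_K1 = np.linalg.solve(K2, K1)        # K2^{-1} K1 without an explicit inverse
    _, logdet1 = np.linalg.slogdet(K1)         # numerically stable log-determinants
    _, logdet2 = np.linalg.slogdet(K2)
    return 0.5 * (np.trace(K2_inv_K1) - d + logdet2 - logdet1)

# Sanity checks: KL vanishes for identical matrices and is non-negative.
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])
print(kl_zero_mean_gaussians(K, K))            # ~0.0 up to floating point
print(kl_zero_mean_gaussians(K, np.eye(2)))    # > 0

# Block-diagonal approximation: among matrices Q that are block diagonal over
# the groups {0,1} and {2}, KL(N(0,K1) || N(0,Q)) is minimised by taking the
# corresponding diagonal blocks of K1 (moment matching within each block).
K1 = np.array([[2.0, 0.5, 0.1],
               [0.5, 1.5, 0.2],
               [0.1, 0.2, 1.0]])
Q_opt = K1.copy()
Q_opt[0:2, 2] = 0.0                            # zero the cross-block entries
Q_opt[2, 0:2] = 0.0
Q_other = Q_opt.copy()
Q_other[0, 0] += 0.3                           # any perturbation increases the KL
print(kl_zero_mean_gaussians(K1, Q_opt) < kl_zero_mean_gaussians(K1, Q_other))
```

Solving against `K2` rather than inverting it, and using `slogdet`, keeps the computation stable when the kernel matrices are ill-conditioned, which is common for smooth kernels.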


@TechReport{lawrence-matching04,
  title = 	 {Matching Kernels through Kullback-Leibler Divergence Minimisation},
  author = 	 {Neil D. Lawrence and Guido Sanguinetti},
  year = 	 {2004},
  institution = 	 {The University of Sheffield, Department of Computer Science},
  number =       {CS-04-12},
  url =  	 {http://inverseprobability.com/publications/lawrence-matching04.html},
  abstract = 	 {In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.},
  key = 	 {Lawrence:matching04},
  linkpsgz =  {"ftp://ftp.dcs.shef.ac.uk/home/neil/" # "klobjtech.ps.gz"},

}
%T Matching Kernels through Kullback-Leibler Divergence Minimisation
%A Neil D. Lawrence and Guido Sanguinetti
%D 2004
%F lawrence-matching04
%U http://inverseprobability.com/publications/lawrence-matching04.html
%N CS-04-12
%X In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
TY  - RPRT
TI  - Matching Kernels through Kullback-Leibler Divergence Minimisation
AU  - Neil D. Lawrence
AU  - Guido Sanguinetti
PY  - 2004/01/01
DA  - 2004/01/01
ID  - lawrence-matching04
UR  - http://inverseprobability.com/publications/lawrence-matching04.html
AB  - In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
ER  -

Lawrence, N.D. & Sanguinetti, G. (2004). Matching Kernels through Kullback-Leibler Divergence Minimisation. Technical Report CS-04-12, Department of Computer Science, The University of Sheffield.