Matching Kernels through Kullback-Leibler Divergence Minimisation

Neil D. Lawrence, Guido Sanguinetti, 2004.

Abstract

In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
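As a concrete illustration of the quantity the paper minimises, the KL divergence between two zero-mean Gaussians with covariance (kernel) matrices $K_1$ and $K_2$ has the closed form $\mathrm{KL} = \tfrac{1}{2}\left(\mathrm{tr}(K_2^{-1}K_1) - d + \ln\det K_2 - \ln\det K_1\right)$. The sketch below computes this with NumPy; it is a generic illustration of the objective, not the paper's constrained minimisation algorithm, and the function name is ours.

```python
import numpy as np

def kl_zero_mean_gaussians(K1, K2):
    """KL( N(0, K1) || N(0, K2) ) for positive-definite matrices K1, K2."""
    d = K1.shape[0]
    # tr(K2^{-1} K1) via a linear solve, avoiding an explicit inverse.
    trace_term = np.trace(np.linalg.solve(K2, K1))
    # Log-determinants via slogdet for numerical stability.
    _, logdet1 = np.linalg.slogdet(K1)
    _, logdet2 = np.linalg.slogdet(K2)
    return 0.5 * (trace_term - d + logdet2 - logdet1)
```

The divergence is zero when the two matrices coincide and grows as the approximating matrix departs from the target, which is what makes it a natural objective for matching a constrained (e.g. block-diagonal) kernel to a given one.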

Cite this Paper


BibTeX
@Misc{Lawrence:matching04,
  title = {Matching Kernels through Kullback-Leibler Divergence Minimisation},
  author = {Lawrence, Neil D. and Sanguinetti, Guido},
  year = {2004},
  number = {CS-04-12},
  url = {http://inverseprobability.com/publications/lawrence-matching04.html},
  abstract = {In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.}
}
Endnote
%0 Generic
%T Matching Kernels through Kullback-Leibler Divergence Minimisation
%A Neil D. Lawrence
%A Guido Sanguinetti
%D 2004
%F Lawrence:matching04
%U http://inverseprobability.com/publications/lawrence-matching04.html
%N CS-04-12
%X In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
RIS
TY  - GEN
TI  - Matching Kernels through Kullback-Leibler Divergence Minimisation
AU  - Neil D. Lawrence
AU  - Guido Sanguinetti
DA  - 2004/01/01
ID  - Lawrence:matching04
IS  - CS-04-12
UR  - http://inverseprobability.com/publications/lawrence-matching04.html
AB  - In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
ER  - 
APA
Lawrence, N. D., & Sanguinetti, G. (2004). Matching Kernels through Kullback-Leibler Divergence Minimisation (CS-04-12). Available from http://inverseprobability.com/publications/lawrence-matching04.html.