# Matching Kernels through Kullback-Leibler Divergence Minimisation

Neil D. Lawrence, Guido Sanguinetti, 2004.

#### Abstract

In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
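The KL divergence between two zero-mean Gaussians has a closed form in terms of the two covariance (kernel) matrices, which is the quantity the paper minimises under constraints. As a minimal sketch (not the paper's constrained solver, just the unconstrained objective), it can be computed as:

```python
import numpy as np

def kl_zero_mean_gaussians(K0, K1):
    """KL( N(0, K0) || N(0, K1) ) for positive-definite kernel matrices.

    Closed form: 0.5 * ( tr(K1^{-1} K0) - d + ln det K1 - ln det K0 ).
    """
    d = K0.shape[0]
    # Solve rather than invert explicitly for numerical stability.
    trace_term = np.trace(np.linalg.solve(K1, K0))
    # slogdet avoids overflow/underflow in the determinants.
    _, logdet_K0 = np.linalg.slogdet(K0)
    _, logdet_K1 = np.linalg.slogdet(K1)
    return 0.5 * (trace_term - d + logdet_K1 - logdet_K0)
```

The divergence is zero when the two matrices coincide and strictly positive otherwise; the paper's constrained problem restricts the family of approximating matrices `K1` (e.g. to block-diagonal structure) before minimising this objective.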

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-lawrence-matching04,
  title = {Matching Kernels through Kullback-Leibler Divergence Minimisation},
  author = {Neil D. Lawrence and Guido Sanguinetti},
  year = {2004},
  number = {CS-04-12},
  url = {http://inverseprobability.com/publications/lawrence-matching04.html},
  abstract = {In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.}
}
```

Endnote

```
%0 Conference Paper
%T Matching Kernels through Kullback-Leibler Divergence Minimisation
%A Neil D. Lawrence
%A Guido Sanguinetti
%C Proceedings of Machine Learning Research
%D 2004
%F pmlr-v-lawrence-matching04
%I PMLR
%J Proceedings of Machine Learning Research
%U http://inverseprobability.com/publications/lawrence-matching04.html
%N CS-04-12
%W PMLR
%X In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
```

RIS

```
TY - CPAPER
TI - Matching Kernels through Kullback-Leibler Divergence Minimisation
AU - Neil D. Lawrence
AU - Guido Sanguinetti
PY - 2004
ID - pmlr-v-lawrence-matching04
PB - PMLR
DP - PMLR
UR - http://inverseprobability.com/publications/lawrence-matching04.html
AB - In this paper we study the general constrained minimisation of Kullback-Leibler (KL) divergences between two zero mean Gaussian distributions. We reduce the problem to an equivalent minimisation involving the eigenvectors of the two kernel matrices, and provide explicit solutions in some cases. We then focus, as an example, on the important case of constraining the approximating matrix to be block diagonal. We prove a stability result on the approximating matrix, and speculate on how these results may be used to give further theoretical foundation to widely used techniques such as spectral clustering.
ER -
```

APA

Lawrence, N.D. & Sanguinetti, G. (2004). Matching Kernels through Kullback-Leibler Divergence Minimisation (CS-04-12).
