# Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis

Tonatiuh Peña-Centeno, Neil D. Lawrence, 7:455-491, 2006.

#### Abstract

In this paper we consider a novel Bayesian interpretation of Fisher’s discriminant analysis. We relate Rayleigh’s coefficient to a noise model that minimizes a cost based on the most probable class centres and that abandons the ‘regression to the labels’ assumption used by other algorithms. This yields a direction of discrimination equivalent to Fisher’s discriminant. We use Bayes’ rule to infer the posterior distribution for the direction of discrimination and in this process, priors and constraining distributions are incorporated to reach the desired result. Going further, with the use of a Gaussian process prior we show the equivalence of our model to a regularised kernel Fisher’s discriminant. A key advantage of our approach is the facility to determine kernel parameters and the regularisation coefficient through the optimisation of the marginal log-likelihood of the data. An added bonus of the new formulation is that it enables us to link the regularisation coefficient with the generalisation error.
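The regularised kernel Fisher's discriminant that the paper shows to be equivalent to its Bayesian model can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the function names, the RBF kernel choice, and the fixed regularisation coefficient `gamma` are assumptions for the sketch; in the paper, the kernel parameters and `gamma` are instead set by maximising the marginal log-likelihood.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel; the lengthscale is one of the
    # kernel parameters the paper proposes tuning via the marginal
    # log-likelihood rather than by cross-validation.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def kernel_fisher_direction(X, y, lengthscale=1.0, gamma=1e-3):
    """Regularised kernel Fisher discriminant for two classes (0/1).

    Returns dual coefficients alpha so the discriminant function is
    f(x) = sum_j alpha_j k(x_j, x).  Here gamma plays the role of the
    regularisation coefficient discussed in the paper; it is fixed by
    hand in this sketch.
    """
    K = rbf_kernel(X, X, lengthscale)
    n = len(y)
    # "Kernelised" class means: mean of the kernel columns per class.
    m0 = K[:, y == 0].mean(axis=1)
    m1 = K[:, y == 1].mean(axis=1)
    # Within-class scatter in the dual representation.
    N = np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        centring = np.eye(nc) - np.full((nc, nc), 1.0 / nc)
        N += Kc @ centring @ Kc.T
    # gamma * I regularises the (typically rank-deficient) scatter.
    return np.linalg.solve(N + gamma * np.eye(n), m1 - m0)
```

Projecting the training points with `K @ alpha` then gives the one-dimensional discriminant scores; the larger `gamma` is, the more the direction is shrunk, which is the coefficient the paper links to the generalisation error.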

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-pena-fbd04,
title = {Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis},
author = {Tonatiuh Peña-Centeno and Neil D. Lawrence},
pages = {455--491},
year = {2006},
editor = {},
volume = {7},
url = {http://inverseprobability.com/publications/pena-fbd04.html},
abstract = {In this paper we consider a novel Bayesian interpretation of Fisher’s discriminant analysis. We relate Rayleigh’s coefficient to a noise model that minimizes a cost based on the most probable class centres and that abandons the ‘regression to the labels’ assumption used by other algorithms. This yields a direction of discrimination equivalent to Fisher’s discriminant. We use Bayes’ rule to infer the posterior distribution for the direction of discrimination and in this process, priors and constraining distributions are incorporated to reach the desired result. Going further, with the use of a Gaussian process prior we show the equivalence of our model to a regularised kernel Fisher’s discriminant. A key advantage of our approach is the facility to determine kernel parameters and the regularisation coefficient through the optimisation of the marginal log-likelihood of the data. An added bonus of the new formulation is that it enables us to link the regularisation coefficient with the generalisation error.}
}
```

Endnote

```
%0 Conference Paper
%T Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis
%A Tonatiuh Peña-Centeno
%A Neil D. Lawrence
%B
%C Proceedings of Machine Learning Research
%D 2006
%E
%F pmlr-v-pena-fbd04
%I PMLR
%J Proceedings of Machine Learning Research
%P 455--491
%U http://inverseprobability.com
%V
%W PMLR
%X In this paper we consider a novel Bayesian interpretation of Fisher’s discriminant analysis. We relate Rayleigh’s coefficient to a noise model that minimizes a cost based on the most probable class centres and that abandons the ‘regression to the labels’ assumption used by other algorithms. This yields a direction of discrimination equivalent to Fisher’s discriminant. We use Bayes’ rule to infer the posterior distribution for the direction of discrimination and in this process, priors and constraining distributions are incorporated to reach the desired result. Going further, with the use of a Gaussian process prior we show the equivalence of our model to a regularised kernel Fisher’s discriminant. A key advantage of our approach is the facility to determine kernel parameters and the regularisation coefficient through the optimisation of the marginal log-likelihood of the data. An added bonus of the new formulation is that it enables us to link the regularisation coefficient with the generalisation error.
```

RIS

```
TY - CPAPER
TI - Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis
AU - Tonatiuh Peña-Centeno
AU - Neil D. Lawrence
BT -
PY - 2006
DA -
ED -
ID - pmlr-v-pena-fbd04
PB - PMLR
SP - 455
DP - PMLR
EP - 491
L1 -
UR - http://inverseprobability.com/publications/pena-fbd04.html
AB - In this paper we consider a novel Bayesian interpretation of Fisher’s discriminant analysis. We relate Rayleigh’s coefficient to a noise model that minimizes a cost based on the most probable class centres and that abandons the ‘regression to the labels’ assumption used by other algorithms. This yields a direction of discrimination equivalent to Fisher’s discriminant. We use Bayes’ rule to infer the posterior distribution for the direction of discrimination and in this process, priors and constraining distributions are incorporated to reach the desired result. Going further, with the use of a Gaussian process prior we show the equivalence of our model to a regularised kernel Fisher’s discriminant. A key advantage of our approach is the facility to determine kernel parameters and the regularisation coefficient through the optimisation of the marginal log-likelihood of the data. An added bonus of the new formulation is that it enables us to link the regularisation coefficient with the generalisation error.
ER -
```

APA

`Peña-Centeno, T. & Lawrence, N.D. (2006). Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis. `*in PMLR* 7:455-491
