# Gaussian Processes and the Null-Category Noise Model

Neil D. Lawrence, Michael I. Jordan, pp. 152–165, 2006.

#### Abstract

With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
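As a rough illustration of the idea, the snippet below sketches one probit-based parameterisation of a null-category likelihood: a third, never-observed label $y=0$ absorbs probability mass in a band of half-width $a$ around the latent function $f=0$, so observed labels push $f$ away from this margin-like region. The function names and the choice $a = 0.5$ are illustrative assumptions, not the chapter's exact specification.

```python
import math


def ncnm_likelihood(f, a=0.5):
    """A sketch of a null-category noise model p(y | f), y in {-1, 0, +1}.

    The 'null' category y = 0 occupies a band of half-width `a` around
    f = 0 and is never observed, so labelled points drive the latent
    function f outside this margin-like region.
    """
    # Standard normal CDF via the error function (probit link).
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p_neg = Phi(-(f + a))             # p(y = -1 | f)
    p_null = Phi(f + a) - Phi(f - a)  # p(y = 0 | f), the unobserved class
    p_pos = Phi(f - a)                # p(y = +1 | f)
    return p_neg, p_null, p_pos
```

The three probabilities sum to one for any $f$; near $f = 0$ the null category carries most of the mass, which is what penalises decision boundaries that pass through dense, labelled-or-unlabeled regions of the input space.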

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-lawrence-gpncnm05,
title = {Gaussian Processes and the Null-Category Noise Model},
author = {Neil D. Lawrence and Michael I. Jordan},
pages = {152--165},
year = {2006},
editor = {},
address = {Cambridge, MA},
url = {http://inverseprobability.com/publications/lawrence-gpncnm05.html},
abstract = {With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.}
}
```

Endnote

```
%0 Conference Paper
%T Gaussian Processes and the Null-Category Noise Model
%A Neil D. Lawrence
%A Michael I. Jordan
%B
%C Proceedings of Machine Learning Research
%D 2006
%E
%F pmlr-v-lawrence-gpncnm05
%I PMLR
%J Proceedings of Machine Learning Research
%P 152--165
%U http://inverseprobability.com
%V
%W PMLR
%X With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
```

RIS

```
TY - CPAPER
TI - Gaussian Processes and the Null-Category Noise Model
AU - Neil D. Lawrence
AU - Michael I. Jordan
BT -
PY - 2006
DA -
ED -
ID - pmlr-v-lawrence-gpncnm05
PB - PMLR
SP - 152
DP - PMLR
EP - 165
L1 -
UR - http://inverseprobability.com/publications/lawrence-gpncnm05.html
AB - With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
ER -
```

APA

`Lawrence, N.D. & Jordan, M.I. (2006). Gaussian Processes and the Null-Category Noise Model. `*in PMLR*, pp. 152–165.
