Gaussian Processes and the Null-Category Noise Model

Neil D. Lawrence, Michael I. Jordan
In Semi-supervised Learning, MIT Press:152-165, 2006.

Abstract

With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
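As a minimal sketch of how such a noise model can be parameterised (an ordered-probit form with half-width $a$, assumed here for illustration; the exact parameterisation used in the chapter may differ), the binary labels $y_i \in \{-1, 1\}$ are augmented with a third, never-observed "null" category $y_i = 0$ occupying a band of width $2a$ around the decision boundary:

$$
p(y_i = -1 \mid f_i) = \Phi\big(-(f_i + a)\big), \qquad
p(y_i = 0 \mid f_i) = \Phi(f_i + a) - \Phi(f_i - a), \qquad
p(y_i = 1 \mid f_i) = \Phi(f_i - a),
$$

where $f_i$ is the latent Gaussian process function value at $x_i$ and $\Phi$ is the standard normal cumulative distribution function. The constraint that no point is ever labeled with the null category plays the role of the SVM margin: posterior mass is pushed out of the band $|f_i| < a$, so the presence of unlabeled points discourages the decision boundary from passing through dense regions of the input space.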

Cite this Paper


BibTeX
@Misc{Lawrence:gpncnm05,
  title = {Gaussian Processes and the Null-Category Noise Model},
  author = {Lawrence, Neil D. and Jordan, Michael I.},
  pages = {152--165},
  year = {2006},
  editor = {Chapelle, Olivier and Schölkopf, Bernhard and Zien, Alex},
  address = {Cambridge, MA},
  publisher = {MIT Press},
  url = {http://inverseprobability.com/publications/lawrence-gpncnm05.html},
  abstract = {With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.}
}
Endnote
%0 Generic
%T Gaussian Processes and the Null-Category Noise Model
%A Neil D. Lawrence
%A Michael I. Jordan
%D 2006
%E Olivier Chapelle
%E Bernhard Schölkopf
%E Alex Zien
%F Lawrence:gpncnm05
%I MIT Press
%P 152--165
%U http://inverseprobability.com/publications/lawrence-gpncnm05.html
%X With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
RIS
TY - GEN
TI - Gaussian Processes and the Null-Category Noise Model
AU - Neil D. Lawrence
AU - Michael I. Jordan
BT - Semi-supervised Learning
DA - 2006/01/01
ED - Olivier Chapelle
ED - Bernhard Schölkopf
ED - Alex Zien
ID - Lawrence:gpncnm05
PB - MIT Press
SP - 152
EP - 165
UR - http://inverseprobability.com/publications/lawrence-gpncnm05.html
AB - With Gaussian process classifiers (GPC) we aim to predict the posterior probability of the class labels given an input data point, $p(y_i|x_i)$. In general we find that this posterior distribution is unaffected by unlabeled data points during learning. Support vector machines are strongly related to GPCs, but one notable difference is that the decision boundary in an SVM can be influenced by unlabeled data. The source of this discrepancy is the SVM’s margin: a characteristic which is not shared with the GPC. The presence of the margin allows the support vector machine to seek low data density regions for the decision boundary, effectively allowing it to incorporate the cluster assumption (see Chapter 6). In this chapter we present the *null category noise model*, a probabilistic equivalent of the margin. By combining this noise model with a GPC we are able to incorporate the cluster assumption without explicitly modeling the input data density and without a special choice of kernel.
ER -
APA
Lawrence, N.D. & Jordan, M.I. (2006). Gaussian Processes and the Null-Category Noise Model. In Semi-supervised Learning (pp. 152-165). Cambridge, MA: MIT Press. Available from http://inverseprobability.com/publications/lawrence-gpncnm05.html.
