Semi-supervised Learning via Gaussian Processes

Neil D. Lawrence, Michael I. Jordan
Advances in Neural Information Processing Systems 17:753-760, MIT Press, 2005.

Abstract

We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a “null category noise model” (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.
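For a concrete picture of the noise model described above, the sketch below is an illustrative reconstruction, not the paper's code: the function name, the use of a cumulative Gaussian, and the default null-category width are assumptions. It shows how an ordered-categorical likelihood with an unobserved central "null" category can assign most probability mass to that category when the latent function is near zero, which is what discourages the decision boundary from passing through dense regions of unlabelled data.

```python
import numpy as np
from scipy.stats import norm

def ncnm_likelihoods(f, width=1.0):
    """Illustrative three-category ordered likelihood with a central
    'null' category of the given width (hypothetical sketch, not the
    authors' implementation).

    f : latent Gaussian process function value(s).
    Returns p(y=-1|f), p(y=0|f), p(y=+1|f).
    """
    half = width / 2.0
    p_neg = norm.cdf(-(f + half))   # y = -1: latent value well below the null region
    p_pos = norm.cdf(f - half)      # y = +1: latent value well above the null region
    p_null = 1.0 - p_neg - p_pos    # y = 0: the null category, never actually observed
    return p_neg, p_null, p_pos

# A latent value near zero puts most mass on the null category,
# while values far from zero commit to one of the two classes.
print(ncnm_likelihoods(np.array([0.0, -2.0, 2.0])))
```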

Cite this Paper


BibTeX
@InProceedings{Lawrence:semisuper04,
  title     = {Semi-supervised Learning via Gaussian Processes},
  author    = {Lawrence, Neil D. and Jordan, Michael I.},
  booktitle = {Advances in Neural Information Processing Systems},
  pages     = {753--760},
  year      = {2005},
  editor    = {Saul, Lawrence and Weiss, Yair and Bottou, Léon},
  volume    = {17},
  address   = {Cambridge, MA},
  publisher = {MIT Press},
  pdf       = {https://lawrennd.github.io/publications/files/ncnm.pdf},
  url       = {http://inverseprobability.com/publications/lawrence-semisuper04.html},
  abstract  = {We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a “null category noise model” (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.}
}
Endnote
%0 Conference Paper
%T Semi-supervised Learning via Gaussian Processes
%A Neil D. Lawrence
%A Michael I. Jordan
%B Advances in Neural Information Processing Systems
%D 2005
%E Lawrence Saul
%E Yair Weiss
%E Léon Bottou
%F Lawrence:semisuper04
%I MIT Press
%P 753--760
%U http://inverseprobability.com/publications/lawrence-semisuper04.html
%V 17
%X We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a “null category noise model” (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.
RIS
TY - CPAPER
TI - Semi-supervised Learning via Gaussian Processes
AU - Neil D. Lawrence
AU - Michael I. Jordan
BT - Advances in Neural Information Processing Systems
DA - 2005/01/01
ED - Lawrence Saul
ED - Yair Weiss
ED - Léon Bottou
ID - Lawrence:semisuper04
PB - MIT Press
VL - 17
SP - 753
EP - 760
L1 - https://lawrennd.github.io/publications/files/ncnm.pdf
UR - http://inverseprobability.com/publications/lawrence-semisuper04.html
AB - We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a “null category noise model” (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.
ER -
APA
Lawrence, N.D. & Jordan, M.I. (2005). Semi-supervised Learning via Gaussian Processes. Advances in Neural Information Processing Systems 17:753-760. Available from http://inverseprobability.com/publications/lawrence-semisuper04.html.
