The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine

Neil D. Lawrence, Matthias Seeger, Ralf Herbrich
Technical Report CS-04-07, 2004.

Abstract

We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set of size $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.
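The abstract's central mechanism, tracking posterior variances and using them to greedily choose an active set, can be illustrated with a short sketch. The version below is a simplification under assumed conditions: it uses a Gaussian-noise (regression) model rather than the classification site updates of the paper, an RBF kernel with a made-up lengthscale, and an entropy-reduction score of $\frac{1}{2}\log(1 + \zeta_j/\sigma^2)$ for candidate $j$. The function and variable names (`ivm_select`, `zeta`, `M`) are illustrative, not from the paper. Each of the $d$ selection steps costs $O(dN)$, matching the stated $O(d^{2}N)$ total.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def ivm_select(X, d, noise_var=0.1, lengthscale=1.0):
    """Greedily select d active points by posterior-entropy reduction.

    Only the diagonal of the posterior covariance (zeta) and the d
    rank-one update vectors (rows of M) are stored, never the full
    N x N posterior -- this is the O(dN) storage the abstract refers to.
    """
    N = X.shape[0]
    K = rbf_kernel(X, X, lengthscale)  # full kernel kept here for clarity;
                                       # only one column per step is needed
    zeta = np.diag(K).copy()           # posterior marginal variances
    M = np.zeros((0, N))               # accumulated rank-one update vectors
    active = []
    for _ in range(d):
        # Entropy reduction under Gaussian noise: 0.5 * log(1 + zeta_j / sigma^2).
        score = 0.5 * np.log1p(zeta / noise_var)
        score[active] = -np.inf        # exclude points already selected
        i = int(np.argmax(score))
        # Current posterior covariance column for point i: A[:, i] = K[:, i] - M^T M[:, i].
        a_i = K[:, i] - M.T @ M[:, i]
        mu = a_i / np.sqrt(zeta[i] + noise_var)
        zeta = zeta - mu**2            # rank-one update of all marginal variances
        M = np.vstack([M, mu])
        active.append(i)
    return active

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
print(ivm_select(X, d=5))
```

Because each selected point shrinks the variances of nearby points, the greedy score naturally spreads the active set across the input space rather than clustering it, which is what makes the sparse representation informative.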

Cite this Paper


BibTeX
@TechReport{pmlr-v-lawrence-ivmtech04,
  title = {The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine},
  author = {Neil D. Lawrence and Matthias Seeger and Ralf Herbrich},
  year = {2004},
  number = {CS-04-07},
  url = {http://inverseprobability.com/publications/lawrence-ivmtech04.html},
  abstract = {We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set of size $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.}
}
