The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine

Neil D. Lawrence, Matthias Seeger, Ralf Herbrich, 2004.

Abstract

We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set size of $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.
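To make the complexity claim concrete, the greedy selection the abstract describes can be sketched for the Gaussian-noise (regression) special case: track the diagonal of the posterior covariance, score each remaining point by its entropy reduction, and include the winner via a rank-one update. This is a minimal illustrative sketch, not the paper's full algorithm (the report handles general likelihoods via assumed-density-filtering site updates); the function name `ivm_select` and the pure-NumPy interface are assumptions for illustration. Each inclusion costs $O(dN)$, giving $O(d^{2}N)$ time and $O(dN)$ storage overall, as quoted in the abstract.

```python
import numpy as np

def ivm_select(K, sigma2, d):
    """Greedy IVM-style active-set selection, Gaussian-noise sketch.

    K      : (N, N) kernel (prior covariance) matrix.
    sigma2 : noise variance.
    d      : active-set size.
    Returns the active-set indices and the (d, N) low-rank factor M,
    with posterior variances equal to diag(K) - sum(M**2, axis=0).
    """
    N = K.shape[0]
    h = np.diag(K).copy()        # current posterior variances, O(N) storage
    M = np.zeros((d, N))         # low-rank representation, O(dN) storage
    active = []
    for j in range(d):
        # entropy reduction from including point i (Gaussian-noise case)
        score = 0.5 * np.log1p(h / sigma2)
        score[active] = -np.inf  # never reselect an included point
        i = int(np.argmax(score))
        active.append(i)
        # i-th column of the current posterior covariance
        s = K[:, i] - M[:j].T @ M[:j, i]
        m = s / np.sqrt(h[i] + sigma2)
        M[j] = m
        h = h - m ** 2           # rank-one update of all N variances: O(dN)
    return active, M
```

Because only `h` and the growing rows of `M` are kept, the full $N \times N$ posterior covariance is never formed; the variances that the SVM discards are exactly what drive the selection.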

Cite this Paper


BibTeX
@Misc{Lawrence:ivmTech04,
  title = {The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine},
  author = {Lawrence, Neil D. and Seeger, Matthias and Herbrich, Ralf},
  year = {2004},
  number = {CS-04-07},
  url = {http://inverseprobability.com/publications/lawrence-ivmtech04.html},
  abstract = {We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set size of $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.},
  note = {Last updated December 2005}
}
Endnote
%0 Generic
%T The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine
%A Neil D. Lawrence
%A Matthias Seeger
%A Ralf Herbrich
%D 2004
%F Lawrence:ivmTech04
%U http://inverseprobability.com/publications/lawrence-ivmtech04.html
%N CS-04-07
%X We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set size of $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.
%Z Last updated December 2005
RIS
TY  - GEN
TI  - The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine
AU  - Neil D. Lawrence
AU  - Matthias Seeger
AU  - Ralf Herbrich
DA  - 2004/01/01
ID  - Lawrence:ivmTech04
IS  - CS-04-07
UR  - http://inverseprobability.com/publications/lawrence-ivmtech04.html
AB  - We present a practical probabilistic alternative to the popular support vector machine (SVM). The algorithm is an approximation to a Gaussian process, and is probabilistic in the sense that it maintains the process variance that is implied by the use of a kernel function, which the SVM discards. We show that these variances may be tracked and made use of in the selection of an active set, which gives a sparse representation for the model. For an active set size of $d$ our algorithm exhibits $O(d^{2}N)$ computational complexity and $O(dN)$ storage requirements. It has already been shown that the approach is competitive with the SVM in terms of performance and running time; here we give more details of the approach and demonstrate that kernel parameters may also be learned in a practical and effective manner.
N1  - Last updated December 2005
ER  -
APA
Lawrence, N.D., Seeger, M. & Herbrich, R. (2004). The Informative Vector Machine: A Practical Probabilistic Alternative to the Support Vector Machine. (CS-04-07). Available from http://inverseprobability.com/publications/lawrence-ivmtech04.html. Last updated December 2005.

Related Material