Fast Forward Selection to Speed Up Sparse Gaussian Process Regression

Matthias Seeger, Christopher K. I. Williams, Neil D. Lawrence
2003.

Abstract

We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the “support” patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically. We demonstrate the model selection capabilities of the algorithm in a range of experiments. In line with the development of our method, we present a simple view on sparse approximations for GP models and their underlying assumptions and show relations to other methods.
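To make the setting concrete, below is a minimal sketch of greedy forward selection for a sparse GP regression approximation. It is not the paper's algorithm: the RBF kernel, the subset-of-regressors style predictor, the exhaustive squared-error scoring of candidates, and all names (`rbf_kernel`, `sparse_gp_fit`, `n_active`, the noise level) are illustrative assumptions. The paper's contribution is precisely a much faster selection heuristic, together with a stable approximation of the log marginal likelihood for hyperparameter optimisation, neither of which this sketch reproduces.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_fit(X, y, n_active, noise=0.1):
    """Greedily pick an active ("support") set and return it with its weights.

    Scores every remaining candidate by the training squared error of the
    resulting subset-of-regressors fit -- a slow, crude stand-in for the
    fast selection heuristic developed in the paper.
    """
    n = X.shape[0]
    active, remaining = [], list(range(n))
    for _ in range(n_active):
        best_score, best_i = -np.inf, None
        for i in remaining:
            idx = active + [i]
            Knm = rbf_kernel(X, X[idx])            # n x m cross-covariance
            Kmm = rbf_kernel(X[idx], X[idx])       # m x m active-set covariance
            A = Knm.T @ Knm + noise ** 2 * Kmm + 1e-8 * np.eye(len(idx))
            w = np.linalg.solve(A, Knm.T @ y)      # subset-of-regressors weights
            score = -np.sum((y - Knm @ w) ** 2)    # crude proxy for the gain
            if score > best_score:
                best_score, best_i = score, i
        active.append(best_i)
        remaining.remove(best_i)
    # final weights for the chosen active set
    Knm = rbf_kernel(X, X[active])
    Kmm = rbf_kernel(X[active], X[active])
    A = Knm.T @ Knm + noise ** 2 * Kmm + 1e-8 * np.eye(len(active))
    w = np.linalg.solve(A, Knm.T @ y)
    return np.array(active), w

def sparse_gp_predict(Xstar, X, active, w):
    """Predictive mean at test inputs, using only the selected support points."""
    return rbf_kernel(Xstar, X[active]) @ w

# Toy usage: 200 noisy samples of a sine curve, 15 support points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
active, w = sparse_gp_fit(X, y, n_active=15)
print("selected support points:", np.sort(active))

The exhaustive inner loop above costs O(n) candidate evaluations per inclusion; the point of the paper's heuristic is to avoid exactly this cost, making selection essentially as cheap as choosing support patterns at random.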

Cite this Paper


BibTeX
@InProceedings{pmlr-v-seeger-fast03,
  title = {Fast Forward Selection to Speed Up Sparse Gaussian Process Regression},
  author = {Matthias Seeger and Christopher K. I. Williams and Neil D. Lawrence},
  year = {2003},
  editor = {},
  address = {Key West, FL},
  url = {http://inverseprobability.com/publications/seeger-fast03.html},
  abstract = {We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the “support” patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically. We demonstrate the model selection capabilities of the algorithm in a range of experiments. In line with the development of our method, we present a simple view on sparse approximations for GP models and their underlying assumptions and show relations to other methods.}
}
Endnote
%0 Conference Paper
%T Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
%A Matthias Seeger
%A Christopher K. I. Williams
%A Neil D. Lawrence
%B
%C Proceedings of Machine Learning Research
%D 2003
%E
%F pmlr-v-seeger-fast03
%I PMLR
%J Proceedings of Machine Learning Research
%P --
%U http://inverseprobability.com
%V
%W PMLR
%X We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the “support” patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically. We demonstrate the model selection capabilities of the algorithm in a range of experiments. In line with the development of our method, we present a simple view on sparse approximations for GP models and their underlying assumptions and show relations to other methods.
RIS
TY - CPAPER
TI - Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
AU - Matthias Seeger
AU - Christopher K. I. Williams
AU - Neil D. Lawrence
BT -
PY - 2003
DA -
ED -
ID - pmlr-v-seeger-fast03
PB - PMLR
SP -
DP - PMLR
EP -
L1 -
UR - http://inverseprobability.com/publications/seeger-fast03.html
AB - We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the “support” patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically. We demonstrate the model selection capabilities of the algorithm in a range of experiments. In line with the development of our method, we present a simple view on sparse approximations for GP models and their underlying assumptions and show relations to other methods.
ER -
APA
Seeger, M., Williams, C.K.I. & Lawrence, N.D. (2003). Fast Forward Selection to Speed Up Sparse Gaussian Process Regression. In PMLR.
