Efficient Sampling for Gaussian Process Inference using Control Variables

Michalis K. Titsias, Neil D. Lawrence and Magnus Rattray
Advances in Neural Information Processing Systems 21:1681-1688, MIT Press, 2008.

Abstract

Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
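The core iteration described above can be sketched in code. This is a minimal illustration, not the paper's full algorithm: it uses a toy regression likelihood, fixed (rather than optimized) control-input locations, and a simple symmetric random-walk proposal on the control variables instead of the paper's proposal scheme. Because the function values are drawn from the conditional GP prior, the conditional-prior terms cancel in the Metropolis-Hastings ratio, leaving the likelihood ratio times the control-variable prior ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X1, X2, ell=1.0, var=1.0):
    # Squared-exponential covariance function.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ell**2)

# Toy regression data (hypothetical example).
X = np.linspace(0, 5, 40)
y = np.sin(X) + 0.1 * rng.standard_normal(40)
Xc = np.linspace(0, 5, 5)       # control-variable input locations (fixed here)
noise = 0.1

Kff = rbf(X, X)
Kcc = rbf(Xc, Xc) + 1e-8 * np.eye(len(Xc))
Kfc = rbf(X, Xc)
Kcc_inv = np.linalg.inv(Kcc)
A = Kfc @ Kcc_inv               # weights of the conditional GP mean
Sig = Kff - A @ Kfc.T + 1e-8 * np.eye(len(X))
L_Sig = np.linalg.cholesky(Sig)
L_cc = np.linalg.cholesky(Kcc)

def log_lik(f):
    # Gaussian regression log-likelihood (up to a constant).
    return -0.5 * np.sum((y - f) ** 2) / noise**2

def log_prior_c(fc):
    # GP prior over the control variables (up to a constant).
    return -0.5 * fc @ Kcc_inv @ fc

def cond_sample(fc):
    # Draw the function from the conditional GP prior p(f | fc).
    return A @ fc + L_Sig @ rng.standard_normal(len(X))

fc = L_cc @ rng.standard_normal(len(Xc))
f = cond_sample(fc)
accepted, n_iter = 0, 200
for _ in range(n_iter):
    fc_new = fc + 0.3 * rng.standard_normal(len(Xc))  # symmetric random walk
    f_new = cond_sample(fc_new)
    # Conditional-prior terms cancel in the MH ratio:
    log_a = (log_lik(f_new) + log_prior_c(fc_new)
             - log_lik(f) - log_prior_c(fc))
    if np.log(rng.uniform()) < log_a:
        fc, f = fc_new, f_new
        accepted += 1
print(f"acceptance rate: {accepted / n_iter:.2f}")
```

Because each proposal perturbs only the low-dimensional control vector and regenerates the full function from the conditional prior, the chain avoids the poor mixing that a direct random walk on the highly correlated function values would suffer.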

Cite this Paper


BibTeX
@InProceedings{Titsias-efficient08,
  title = {Efficient Sampling for {G}aussian Process Inference using Control Variables},
  author = {Titsias, Michalis K. and Lawrence, Neil D. and Rattray, Magnus},
  booktitle = {Advances in Neural Information Processing Systems},
  pages = {1681--1688},
  year = {2008},
  editor = {Koller, Daphne and Schuurmans, Dale and Bengio, Yoshua and Bottou, Leon},
  volume = {21},
  address = {Cambridge, MA},
  publisher = {MIT Press},
  pdf = {https://proceedings.neurips.cc/paper/2008/file/5487315b1286f907165907aa8fc96619-Paper.pdf},
  url = {http://inverseprobability.com/publications/titsias-efficient08.html},
  abstract = {Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.}
}
Endnote
%0 Conference Paper
%T Efficient Sampling for Gaussian Process Inference using Control Variables
%A Michalis K. Titsias
%A Neil D. Lawrence
%A Magnus Rattray
%B Advances in Neural Information Processing Systems
%D 2008
%E Daphne Koller
%E Dale Schuurmans
%E Yoshua Bengio
%E Leon Bottou
%F Titsias-efficient08
%I MIT Press
%P 1681--1688
%U http://inverseprobability.com/publications/titsias-efficient08.html
%V 21
%X Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
RIS
TY - CPAPER
TI - Efficient Sampling for Gaussian Process Inference using Control Variables
AU - Michalis K. Titsias
AU - Neil D. Lawrence
AU - Magnus Rattray
BT - Advances in Neural Information Processing Systems
DA - 2008/12/08
ED - Daphne Koller
ED - Dale Schuurmans
ED - Yoshua Bengio
ED - Leon Bottou
ID - Titsias-efficient08
PB - MIT Press
VL - 21
SP - 1681
EP - 1688
L1 - https://proceedings.neurips.cc/paper/2008/file/5487315b1286f907165907aa8fc96619-Paper.pdf
UR - http://inverseprobability.com/publications/titsias-efficient08.html
AB - Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
ER -
APA
Titsias, M.K., Lawrence, N.D. & Rattray, M. (2008). Efficient Sampling for Gaussian Process Inference using Control Variables. Advances in Neural Information Processing Systems 21:1681-1688. Available from http://inverseprobability.com/publications/titsias-efficient08.html.