Efficient Sampling for Gaussian Process Inference using Control Variables

Michalis K. Titsias, University of Athens
Neil D. Lawrence, University of Sheffield
Magnus Rattray, University of Manchester

In Advances in Neural Information Processing Systems 21, pp. 1681–1688

Abstract

Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables, which are auxiliary function values that provide a low-dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
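To illustrate the scheme the abstract describes, here is a minimal NumPy sketch for a toy GP regression case. Everything here is an assumption for illustration, not the paper's implementation: the function and variable names (`rbf`, `control_variable_sampler`), the squared-exponential kernel, the symmetric random-walk proposal, and the fixed control-input locations (the paper finds these by minimizing an objective function, which the sketch omits).

```python
import numpy as np

def rbf(X1, X2, ell=1.0):
    """Squared-exponential kernel on 1-D inputs (illustrative choice)."""
    return np.exp(-0.5 * (X1[:, None] - X2[None, :]) ** 2 / ell ** 2)

def control_variable_sampler(X, y, Xc, n_iter=300, noise=0.1, step=0.3, seed=0):
    """Toy MCMC over GP function values using control variables.

    Each iteration perturbs the control values fc with a random walk,
    draws the full function f from the conditional GP prior p(f | fc),
    and accepts or rejects with a Metropolis-Hastings step.  For a
    symmetric proposal the conditional-prior terms cancel, leaving the
    acceptance ratio p(y | f') p(fc') / (p(y | f) p(fc)).
    """
    rng = np.random.default_rng(seed)
    n, M = len(X), len(Xc)
    Kcc = rbf(Xc, Xc) + 1e-8 * np.eye(M)
    Kcc_inv = np.linalg.inv(Kcc)
    A = rbf(X, Xc) @ Kcc_inv                 # conditional-mean weights
    Cov = rbf(X, X) - A @ rbf(Xc, X)         # covariance of f given fc
    L = np.linalg.cholesky(Cov + 1e-6 * np.eye(n))

    def loglik(f):
        return -0.5 * np.sum((y - f) ** 2) / noise ** 2

    def logprior(fc):
        return -0.5 * fc @ Kcc_inv @ fc

    fc = np.zeros(M)
    f = A @ fc                               # start at the conditional mean
    lp = loglik(f) + logprior(fc)
    samples = []
    for _ in range(n_iter):
        fc_new = fc + step * rng.standard_normal(M)
        f_new = A @ fc_new + L @ rng.standard_normal(n)
        lp_new = loglik(f_new) + logprior(fc_new)
        if np.log(rng.uniform()) < lp_new - lp:
            fc, f, lp = fc_new, f_new, lp_new
        samples.append(f.copy())
    return np.array(samples)
```

Because only the M control values are proposed (rather than all n highly correlated function values), the random walk can take much larger steps, which is the source of the efficiency gain the abstract refers to.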


@InProceedings{titsias-efficient08,
  title = 	 {Efficient Sampling for Gaussian Process Inference using Control Variables},
  author = 	 {Michalis K. Titsias and Neil D. Lawrence and Magnus Rattray},
  booktitle = 	 {Advances in Neural Information Processing Systems},
  pages = 	 {1681--1688},
  year = 	 {2009},
  editor = 	 {Daphne Koller and Dale Schuurmans and Yoshua Bengio and Leon Bottou},
  volume = 	 {21},
  address = 	 {Cambridge, MA},
  publisher = 	 {MIT Press},
  url =  	 {http://inverseprobability.com/publications/titsias-efficient08.html},
  abstract = 	 {Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.},
  crossref =  {Koller:nips08},
  key = 	 {Titsias:efficient08},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/nipsSamGP08.pdf},
}
%T Efficient Sampling for Gaussian Process Inference using Control Variables
%A Michalis K. Titsias and Neil D. Lawrence and Magnus Rattray
%B Advances in Neural Information Processing Systems
%C Cambridge, MA
%D 2009
%E Daphne Koller and Dale Schuurmans and Yoshua Bengio and Leon Bottou
%F titsias-efficient08
%I MIT Press	
%P 1681--1688
%U http://inverseprobability.com/publications/titsias-efficient08.html
%V 21
%X Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
TY  - CPAPER
TI  - Efficient Sampling for Gaussian Process Inference using Control Variables
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
AU  - Magnus Rattray
BT  - Advances in Neural Information Processing Systems
PY  - 2009/01/01
DA  - 2009/01/01
ED  - Daphne Koller
ED  - Dale Schuurmans
ED  - Yoshua Bengio
ED  - Leon Bottou	
ID  - titsias-efficient08
PB  - MIT Press	
SP  - 1681
EP  - 1688
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/nipsSamGP08.pdf
UR  - http://inverseprobability.com/publications/titsias-efficient08.html
AB  - Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by continuously minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.
ER  -

Titsias, M.K., Lawrence, N.D. & Rattray, M. (2009). Efficient Sampling for Gaussian Process Inference using Control Variables. Advances in Neural Information Processing Systems 21:1681–1688.