Efficient Multioutput Gaussian Processes through Variational Inducing Kernels


Mauricio A. Álvarez, Universidad Tecnológica de Pereira, Colombia
David Luengo, Universidad Politécnica de Madrid
Michalis K. Titsias, University of Athens
Neil D. Lawrence, University of Sheffield

In Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, JMLR W&CP 9, pp. 25–32

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
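The convolution-process (CP) construction described above can be illustrated with a minimal sketch (this is an illustration of the general CP idea, not the paper's implementation; all function and parameter names are chosen here for exposition). Each output is modelled as a convolution of a shared latent Gaussian process with an output-specific smoothing kernel; when both the latent covariance and the smoothing kernels are Gaussian, the cross-covariances have a closed form, because convolving Gaussians simply adds their variances.

```python
import numpy as np

def cp_cross_cov(x, xp, sigma_d, sigma_dp, ell, s_d=1.0, s_dp=1.0):
    """Cross-covariance cov(f_d(x), f_d'(x')) for a CP model in which
    each output f_d is the latent GP u smoothed by a Gaussian kernel
    with width sigma_d and scale s_d, and u has a Gaussian covariance
    with lengthscale ell.  Convolving Gaussians adds variances, so the
    result is another Gaussian in x - x' with variance
    ell^2 + sigma_d^2 + sigma_dp^2 (common constants dropped)."""
    var = ell**2 + sigma_d**2 + sigma_dp**2
    diff = x[:, None] - xp[None, :]
    return s_d * s_dp * ell / np.sqrt(var) * np.exp(-0.5 * diff**2 / var)

def multioutput_cov(x, sigmas, ell):
    """Assemble the full (D*N) x (D*N) covariance of D correlated
    outputs, all observed at the same N inputs x."""
    D, N = len(sigmas), len(x)
    K = np.zeros((D * N, D * N))
    for d in range(D):
        for dp in range(D):
            K[d*N:(d+1)*N, dp*N:(dp+1)*N] = cp_cross_cov(
                x, x, sigmas[d], sigmas[dp], ell)
    return K

# Two outputs with different smoothing widths, five shared inputs.
x = np.linspace(0.0, 1.0, 5)
K = multioutput_cov(x, sigmas=[0.1, 0.3], ell=0.2)
# K is symmetric and positive semi-definite by construction, as any
# valid multi-output covariance must be.
```

Because the blocks all arise from one latent function, the resulting joint covariance is valid (positive semi-definite) by construction; the paper's contribution is making inference in such models efficient, which the naive O((DN)^3) cost of working with K directly does not address.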


@InProceedings{alvarez-efficient10,
  title = 	 {Efficient Multioutput Gaussian Processes through Variational Inducing Kernels},
  author = 	 {Mauricio A. Álvarez and David Luengo and Michalis K. Titsias and Neil D. Lawrence},
  booktitle = 	 {Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics},
  pages = 	 {25--32},
  year = 	 {2010},
  editor = 	 {Yee Whye Teh and D. Michael Titterington},
  volume = 	 {9},
  address = 	 {Chia Laguna Resort, Sardinia, Italy},
  publisher = 	 {JMLR W\&CP 9},
  url =  	 {http://inverseprobability.com/publications/alvarez-efficient10.html},
  abstract = 	 {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.},
  crossref =  {Teh:aistats10},
  key = 	 {Alvarez:efficient10},
  linkpdf = 	 {http://jmlr.csail.mit.edu/proceedings/papers/v9/alvarez10a/alvarez10a.pdf},
  linksoftware = {https://github.com/SheffieldML/multigp},
}
%T Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
%A Mauricio A. Álvarez and David Luengo and Michalis K. Titsias and Neil D. Lawrence
%C Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
%D 2010
%E Yee Whye Teh and D. Michael Titterington
%F alvarez-efficient10
%I JMLR W&CP 9
%P 25--32
%U http://inverseprobability.com/publications/alvarez-efficient10.html
%V 9
%X Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
TY  - CPAPER
TI  - Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
AU  - Mauricio A. Álvarez
AU  - David Luengo
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
BT  - Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
PY  - 2010/01/01
DA  - 2010/01/01
ED  - Yee Whye Teh
ED  - D. Michael Titterington
ID  - alvarez-efficient10
PB  - JMLR W&CP 9
SP  - 25
EP  - 32
L1  - http://jmlr.csail.mit.edu/proceedings/papers/v9/alvarez10a/alvarez10a.pdf
UR  - http://inverseprobability.com/publications/alvarez-efficient10.html
AB  - Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
ER  -

Álvarez, M.A., Luengo, D., Titsias, M.K. & Lawrence, N.D. (2010). Efficient Multioutput Gaussian Processes through Variational Inducing Kernels. Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics 9:25–32