Efficient Multioutput Gaussian Processes through Variational Inducing Kernels

Mauricio A. Álvarez, David Luengo, Michalis K. Titsias, Neil D. Lawrence
Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, PMLR 9:25-32, 2010.

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
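
For readers unfamiliar with the convolution process construction mentioned in the abstract, the following display is a minimal sketch of the standard CP form used in this line of work; the notation is illustrative and may differ from the paper's exact formulation. Each output f_d is generated by convolving a shared latent process u with an output-specific smoothing kernel G_d, which induces cross-covariances between outputs:

% Sketch of the convolution process (CP) construction for multioutput GPs.
% Illustrative notation only; see the paper for the precise formulation.
\begin{align*}
  f_d(\mathbf{x}) &= \int G_d(\mathbf{x} - \mathbf{z})\, u(\mathbf{z})\, \mathrm{d}\mathbf{z},
  \qquad u \sim \mathcal{GP}(0, k_u), \\
  \operatorname{cov}\!\left[f_d(\mathbf{x}), f_{d'}(\mathbf{x}')\right]
  &= \iint G_d(\mathbf{x} - \mathbf{z})\, k_u(\mathbf{z}, \mathbf{z}')\,
     G_{d'}(\mathbf{x}' - \mathbf{z}')\, \mathrm{d}\mathbf{z}\, \mathrm{d}\mathbf{z}'.
\end{align*}

Because every output shares the same latent process u, conditioning on a small set of values of u (or, as in this paper, on inducing functions derived from it) decouples the outputs and yields the efficient sparse and variational approximations the abstract refers to.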

Cite this Paper


BibTeX
@InProceedings{Alvarez:efficient10,
  title = {Efficient Multioutput {G}aussian Processes through Variational Inducing Kernels},
  author = {Álvarez, Mauricio A. and Luengo, David and Titsias, Michalis K. and Lawrence, Neil D.},
  booktitle = {Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics},
  pages = {25--32},
  year = {2010},
  editor = {Teh, Yee Whye and Titterington, D. Michael},
  volume = {9},
  address = {Chia Laguna Resort, Sardinia, Italy},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v9/alvarez10a/alvarez10a.pdf},
  url = {http://inverseprobability.com/publications/alvarez-efficient10.html},
  abstract = {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.}
}
Endnote
%0 Conference Paper
%T Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
%A Mauricio A. Álvarez
%A David Luengo
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
%D 2010
%E Yee Whye Teh
%E D. Michael Titterington
%F Alvarez:efficient10
%I PMLR
%P 25--32
%U http://inverseprobability.com/publications/alvarez-efficient10.html
%V 9
%X Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
RIS
TY - CPAPER
TI - Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
AU - Mauricio A. Álvarez
AU - David Luengo
AU - Michalis K. Titsias
AU - Neil D. Lawrence
BT - Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
DA - 2010/03/31
ED - Yee Whye Teh
ED - D. Michael Titterington
ID - Alvarez:efficient10
PB - PMLR
VL - 9
SP - 25
EP - 32
L1 - http://proceedings.mlr.press/v9/alvarez10a/alvarez10a.pdf
UR - http://inverseprobability.com/publications/alvarez-efficient10.html
AB - Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
ER -
APA
Álvarez, M.A., Luengo, D., Titsias, M.K. & Lawrence, N.D. (2010). Efficient Multioutput Gaussian Processes through Variational Inducing Kernels. Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, PMLR 9:25-32. Available from http://inverseprobability.com/publications/alvarez-efficient10.html.