Efficient Multioutput Gaussian Processes through Variational Inducing Kernels

Mauricio A. Álvarez, David Luengo, Michalis K. Titsias, Neil D. Lawrence
Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, JMLR W&CP 9:25-32, 2010.

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
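The convolution-process (CP) construction mentioned in the abstract can be illustrated numerically. The sketch below (not the paper's code; all names are illustrative) uses the standard 1-D case where each output is the convolution of a normalized Gaussian smoothing kernel with a single shared white-noise latent process, so every cross-covariance has a closed Gaussian form whose variance is the sum of the two outputs' kernel variances:

```python
import numpy as np

# Sketch, assuming Gaussian smoothing kernels over a shared white-noise
# latent process u: f_d(x) = \int G_d(x - z) u(z) dz, with G_d a normalized
# Gaussian of variance var_d. Then Cov[f_d(x), f_d'(x')] is again Gaussian,
# with variance var_d + var_d'.

def cp_cross_cov(x, xp, var_d, var_dp):
    """Closed-form CP cross-covariance between outputs d and d'."""
    v = var_d + var_dp
    return np.exp(-0.5 * (x - xp) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

def joint_cov(X, vars_per_output):
    """Full (D*N) x (D*N) covariance over all D outputs at shared inputs X."""
    D, N = len(vars_per_output), len(X)
    K = np.zeros((D * N, D * N))
    for d, vd in enumerate(vars_per_output):
        for dp, vdp in enumerate(vars_per_output):
            K[d * N:(d + 1) * N, dp * N:(dp + 1) * N] = \
                cp_cross_cov(X[:, None], X[None, :], vd, vdp)
    return K

X = np.linspace(0.0, 5.0, 20)
K = joint_cov(X, vars_per_output=[0.3, 1.0])  # two correlated outputs
```

Because the outputs share one latent process, the stacked matrix `K` is a valid (positive semi-definite) covariance; smoother outputs simply use larger smoothing-kernel variances. Exact inference with this joint covariance costs O(D³N³), which is the scaling problem the paper's sparse and variational approximations address.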

Cite this Paper


BibTeX
@InProceedings{Alvarez:efficient10,
  title = {Efficient Multioutput Gaussian Processes through Variational Inducing Kernels},
  author = {Mauricio A. Álvarez and David Luengo and Michalis K. Titsias and Neil D. Lawrence},
  booktitle = {Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics},
  pages = {25--32},
  year = {2010},
  editor = {Yee Whye Teh and D. Michael Titterington},
  volume = {9},
  address = {Chia Laguna Resort, Sardinia, Italy},
  publisher = {JMLR W\&CP 9},
  abstract = {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.}
}
Endnote
%0 Conference Paper
%T Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
%A Mauricio A. Álvarez
%A David Luengo
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
%D 2010
%E Yee Whye Teh
%E D. Michael Titterington
%F Alvarez:efficient10
%I JMLR W&CP 9
%P 25--32
%V 9
%X Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
RIS
TY  - CPAPER
TI  - Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
AU  - Mauricio A. Álvarez
AU  - David Luengo
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
BT  - Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics
DA  - 2010/01/01
ED  - Yee Whye Teh
ED  - D. Michael Titterington
ID  - Alvarez:efficient10
PB  - JMLR W&CP 9
VL  - 9
SP  - 25
EP  - 32
AB  - Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence @Alvarez:convolved08 recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias @Titsias:variational09 to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
ER  -
APA
Álvarez, M.A., Luengo, D., Titsias, M.K. & Lawrence, N.D. (2010). Efficient Multioutput Gaussian Processes through Variational Inducing Kernels. Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, JMLR W&CP 9:25-32.