Efficient Multioutput Gaussian Processes through Variational Inducing Kernels

Mauricio A. Álvarez, David Luengo, Michalis K. Titsias, Neil D. Lawrence
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:25-32, 2010.

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
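The two ingredients the abstract names can be made concrete with a short sketch. The snippet below, in Python with NumPy, shows (i) the closed-form cross-covariance a convolution process induces between two outputs, assuming a shared white-noise latent function and Gaussian smoothing kernels, and (ii) the single-output collapsed variational bound of Titsias (2009) that the paper extends to multiple outputs. The function names `cp_cross_cov` and `titsias_collapsed_bound` and all parameter values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def cp_cross_cov(X1, X2, ell1, ell2, s1=1.0, s2=1.0):
    # Convolution-process construction (illustrative): each output convolves
    # a shared white-noise latent function u(z) with a Gaussian smoothing
    # kernel,
    #   f_d(x) = s_d * \int N(x - z | 0, ell_d^2) u(z) dz,
    # which gives the closed-form cross-covariance
    #   cov(f_1(x), f_2(x')) = s1 * s2 * N(x - x' | 0, ell1^2 + ell2^2).
    var = ell1 ** 2 + ell2 ** 2
    diff = X1[:, None] - X2[None, :]
    return s1 * s2 * np.exp(-0.5 * diff ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def titsias_collapsed_bound(y, Kff_diag, Kuu, Kuf, noise_var):
    # Collapsed variational lower bound of Titsias (2009) for one output:
    #   F = log N(y | 0, Qff + noise*I) - tr(Kff - Qff) / (2 * noise),
    # where Qff = Kuf^T Kuu^{-1} Kuf is the Nystrom approximation built
    # from the inducing variables u.
    n = y.size
    Qff = Kuf.T @ np.linalg.solve(Kuu, Kuf)
    C = Qff + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    trace_penalty = np.sum(Kff_diag - np.diag(Qff))
    return (-0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)
            - 0.5 * trace_penalty / noise_var)

# Assemble a valid two-output covariance on a shared input grid: smoother
# outputs use wider smoothing kernels, and the off-diagonal block couples
# the outputs through the shared latent process.
X = np.linspace(0.0, 1.0, 50)
K11 = cp_cross_cov(X, X, 0.1, 0.1)
K22 = cp_cross_cov(X, X, 0.3, 0.3)
K12 = cp_cross_cov(X, X, 0.1, 0.3)
K = np.block([[K11, K12], [K12.T, K22]])  # symmetric positive semi-definite
```

In the paper's setting the inducing variables attached to such a bound come from the latent process itself; the "variational inducing kernels" of the title exist precisely because a white-noise latent function cannot be evaluated pointwise, so the inducing variables are defined through smoothed versions of it.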

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-alvarez-efficient10,
  title     = {Efficient Multioutput Gaussian Processes through Variational Inducing Kernels},
  author    = {Mauricio A. Álvarez and David Luengo and Michalis K. Titsias and Neil D. Lawrence},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {25--32},
  year      = {2010},
  editor    = {Yee Whye Teh and Mike Titterington},
  volume    = {9},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  publisher = {PMLR},
  url       = {http://inverseprobability.com/publications/alvarez-efficient10.html},
  abstract  = {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.}
}
Endnote
%0 Conference Paper
%T Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
%A Mauricio A. Álvarez
%A David Luengo
%A Michalis K. Titsias
%A Neil D. Lawrence
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-alvarez-efficient10
%I PMLR
%J Proceedings of Machine Learning Research
%P 25--32
%U http://inverseprobability.com/publications/alvarez-efficient10.html
%V 9
%W PMLR
%X Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
RIS
TY - CPAPER
TI - Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
AU - Mauricio A. Álvarez
AU - David Luengo
AU - Michalis K. Titsias
AU - Neil D. Lawrence
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
PY - 2010
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-alvarez-efficient10
PB - PMLR
VL - 9
SP - 25
EP - 32
DP - PMLR
UR - http://inverseprobability.com/publications/alvarez-efficient10.html
AB - Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
ER -
APA
Álvarez, M.A., Luengo, D., Titsias, M.K. & Lawrence, N.D. (2010). Efficient Multioutput Gaussian Processes through Variational Inducing Kernels. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in PMLR 9:25-32
