Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes

Mauricio A. Álvarez, David Luengo, Michalis K. Titsias and Neil D. Lawrence, 2009.

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2009) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.

Cite this Paper


BibTeX
@InProceedings{pmlr-v-alvarez-viktech09,
  title = {Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes},
  author = {Mauricio A. Álvarez and David Luengo and Michalis K. Titsias and Neil D. Lawrence},
  year = {2009},
  editor = {},
  url = {http://inverseprobability.com/publications/alvarez-viktech09.html},
  abstract = {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2009) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.}
}
Endnote
%0 Conference Paper
%T Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes
%A Mauricio A. Álvarez
%A David Luengo
%A Michalis K. Titsias
%A Neil D. Lawrence
%C Proceedings of Machine Learning Research
%D 2009
%F pmlr-v-alvarez-viktech09
%I PMLR
%J Proceedings of Machine Learning Research
%U http://inverseprobability.com
%W PMLR
%X Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2009) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
RIS
TY  - CPAPER
TI  - Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes
AU  - Mauricio A. Álvarez
AU  - David Luengo
AU  - Michalis K. Titsias
AU  - Neil D. Lawrence
PY  - 2009
ID  - pmlr-v-alvarez-viktech09
PB  - PMLR
DP  - PMLR
UR  - http://inverseprobability.com/publications/alvarez-viktech09.html
AB  - Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2009) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
ER  -
APA
Álvarez, M.A., Luengo, D., Titsias, M.K. & Lawrence, N.D. (2009). Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes. PMLR.

Related Material