Efficient Multioutput Gaussian Processes through Variational Inducing Kernels
Proceedings of the Thirteenth International Workshop on Artificial Intelligence and Statistics, PMLR 9:25-32, 2010.
Abstract
Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective, a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence (2008) recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
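For context, the convolution-process construction mentioned above defines each output as a smoothing of a shared latent function; a standard form (the notation below is illustrative and not taken verbatim from the paper) is

f_d(x) = \int G_d(x - z)\, u(z)\, dz,

where u is a latent process with covariance k_u and G_d is an output-specific smoothing kernel. The induced cross-covariance between outputs d and d' is then

\operatorname{cov}\!\left[f_d(x), f_{d'}(x')\right] = \int\!\!\int G_d(x - z)\, G_{d'}(x' - z')\, k_u(z, z')\, dz\, dz',

which is the kind of multioutput covariance for which the sparse and variational approximations discussed in the paper are designed.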