Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes

Zhenwen Dai, Mauricio A. Álvarez, Neil D. Lawrence
Advances in Neural Information Processing Systems 30:5131-5139, Curran Associates, Inc., 2017.

Abstract

Often in machine learning, data are collected as a combination of multiple conditions, e.g., voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new condition from only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.
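The structural idea behind the model can be illustrated in a few lines: each condition is assigned a latent vector, and the covariance over all (condition, input) pairs factorizes as a Kronecker product of a kernel on the latent space and a kernel on the inputs. The sketch below is ours, not the paper's implementation: all names are illustrative, the latent vectors are fixed rather than inferred variationally as in LVMOGP, and the posterior is computed exactly rather than with the paper's efficient variational method.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """RBF kernel between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

rng = np.random.default_rng(0)
N, D, Q = 20, 3, 2                     # inputs per condition, conditions, latent dim
X = np.linspace(0.0, 5.0, N)[:, None]  # shared inputs across conditions
H = rng.normal(size=(D, Q))            # one latent vector per condition
                                       # (fixed here; inferred variationally in LVMOGP)

# Kronecker-structured covariance over all (condition, input) pairs:
# K = K_H (D x D)  kron  K_X (N x N), giving an ND x ND matrix.
K_H = rbf(H, H)
K_X = rbf(X, X)
K = np.kron(K_H, K_X)

# Sample a joint function draw over all conditions and add observation noise.
noise = 0.1
jitter = 1e-8 * np.eye(N * D)
F = rng.multivariate_normal(np.zeros(N * D), K + jitter)
Y = F + noise * rng.normal(size=N * D)

# Exact GP posterior mean at the training locations. This costs O((ND)^3);
# the paper's variational inference exploits the Kronecker structure to
# bring the cost down to that of sparse Gaussian processes.
alpha = np.linalg.solve(K + noise**2 * np.eye(N * D), Y)
mean = K @ alpha
print("train RMSE:", np.sqrt(np.mean((mean - F) ** 2)))
```

Because the latent kernel K_H is shared across conditions, a new condition only requires estimating its latent vector, which is why the model can generalize from few data points at test time.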

Cite this Paper


BibTeX
@InProceedings{Dai:supervised17,
  title = {Efficient Modeling of Latent Information in Supervised Learning using {G}aussian Processes},
  author = {Dai, Zhenwen and Álvarez, Mauricio A. and Lawrence, Neil D.},
  booktitle = {Advances in Neural Information Processing Systems},
  pages = {5131--5139},
  year = {2017},
  volume = {30},
  address = {Long Beach, California},
  publisher = {Curran Associates, Inc.},
  pdf = {https://proceedings.neurips.cc/paper/2017/file/1680e9fa7b4dd5d62ece800239bb53bd-Paper.pdf},
  url = {http://inverseprobability.com/publications/efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes.html},
  abstract = {Often in machine learning, data are collected as a combination of multiple conditions, e.g., voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new condition from only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.}
}
Endnote
%0 Conference Paper
%T Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes
%A Zhenwen Dai
%A Mauricio A. Álvarez
%A Neil D. Lawrence
%B Advances in Neural Information Processing Systems
%D 2017
%F Dai:supervised17
%I Curran Associates, Inc.
%P 5131--5139
%U http://inverseprobability.com/publications/efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes.html
%V 30
%X Often in machine learning, data are collected as a combination of multiple conditions, e.g., voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new condition from only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.
RIS
TY - CPAPER
TI - Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes
AU - Zhenwen Dai
AU - Mauricio A. Álvarez
AU - Neil D. Lawrence
BT - Advances in Neural Information Processing Systems
DA - 2017/12/05
ID - Dai:supervised17
PB - Curran Associates, Inc.
VL - 30
SP - 5131
EP - 5139
L1 - https://proceedings.neurips.cc/paper/2017/file/1680e9fa7b4dd5d62ece800239bb53bd-Paper.pdf
UR - http://inverseprobability.com/publications/efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes.html
AB - Often in machine learning, data are collected as a combination of multiple conditions, e.g., voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new condition from only a few data points? We present a new model called Latent Variable Multiple Output Gaussian Processes (LVMOGP) that allows us to jointly model multiple conditions for regression and to generalize to a new condition with few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.
ER -
APA
Dai, Z., Álvarez, M. A. & Lawrence, N. D. (2017). Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes. Advances in Neural Information Processing Systems 30:5131-5139. Available from http://inverseprobability.com/publications/efficient-modelling-of-latent-information-in-supervised-learning-using-gaussian-processes.html.