Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes
Advances in Neural Information Processing Systems 30, Curran Associates, Inc., pp. 5131-5139, 2017.
Abstract
Often in machine learning, data are collected under multiple
conditions, e.g., voice recordings of multiple speakers, each
labeled with an ID. How can we build a model that captures the
latent information related to these conditions and generalizes to
a new condition from only a few data points? We present a new
model, the Latent Variable Multiple Output Gaussian Process
(LVMOGP), which jointly models multiple conditions for regression
and can generalize to a new condition with only a few data points
at test time. LVMOGP
infers the posteriors of Gaussian processes together with a latent
space representing the information about different conditions. We
derive an efficient variational inference method for LVMOGP whose
computational complexity is as low as that of sparse Gaussian
processes. We show that LVMOGP significantly outperforms related
Gaussian process methods on various tasks with both synthetic and
real data.
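
The abstract does not spell out the covariance construction, so below is a minimal NumPy sketch of one construction consistent with it: each condition gets a latent vector, and the kernel over (input, condition) pairs is the product of an input kernel and a kernel over the latent vectors. The function names (`rbf`, `lvmogp_kernel`) and the fixed latent matrix `H` are illustrative assumptions; in the paper the latent space is inferred, and the efficient variational inference (not shown here) reaches sparse-GP complexity rather than the naive O(N^3) solve used in this sketch.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def lvmogp_kernel(X, cond, H, ls_x=1.0, ls_h=1.0):
    """Product kernel: input kernel times latent-space kernel.

    X    : (N, Dx) inputs
    cond : (N,) integer condition IDs
    H    : (R, Qh) latent vectors, one per condition (learned in the
           paper, fixed here for illustration)
    """
    Kx = rbf(X, X, ls_x)
    Kh = rbf(H, H, ls_h)
    return Kx * Kh[np.ix_(cond, cond)]

# Toy data: 3 conditions that share structure through their latent vectors.
rng = np.random.default_rng(0)
R, N = 3, 60
X = rng.uniform(-3, 3, (N, 1))
cond = rng.integers(0, R, N)
H = rng.normal(0, 1, (R, 2))                # stand-in for inferred latents
f = np.sin(X[:, 0]) * (1 + 0.3 * H[cond, 0])
y = f + 0.1 * rng.normal(size=N)

# Exact GP posterior mean at the training inputs. This costs O(N^3);
# the paper's variational scheme instead attains the cost of sparse GPs.
K = lvmogp_kernel(X, cond, H)
alpha = np.linalg.solve(K + 0.1**2 * np.eye(N), y)
mean = K @ alpha
print("train RMSE:", np.sqrt(np.mean((mean - f) ** 2)))
```

Because the latent vectors enter only through the kernel, a new condition can be handled by placing its latent vector near those of similar conditions, which is what lets the model generalize from a few data points at test time.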