Semi-described and Semi-supervised Learning with Gaussian Processes

Andreas Damianou, Neil D. Lawrence
31st Conference on Uncertainty in Artificial Intelligence (UAI), 2015.

Abstract

Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as 'semi-described learning'. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that allow for imputing the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
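The intractability mentioned in the abstract arises because the predictive density at an uncertain input, p(f*) = ∫ p(f* | x*) p(x*) dx*, has no closed form under a non-linear GP mapping. The paper handles this variationally; as an illustrative baseline only (not the authors' method), the sketch below approximates the same integral by Monte Carlo over the input distribution, using a standard GP regression posterior with a squared-exponential kernel. All names and parameter values here are our own choices for the example.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # Standard GP regression posterior mean and (latent) variance at test inputs Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(rbf(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mean, var

rng = np.random.default_rng(0)
X = np.linspace(0, 5, 30)
y = np.sin(X) + 0.05 * rng.standard_normal(30)

# Uncertain test input: x* ~ N(2.5, 0.3^2). Since the integral over x*
# is intractable for a non-linear GP, approximate it by sampling x*.
xs_samples = rng.normal(2.5, 0.3, size=2000)
means, vars_ = gp_posterior(X, y, xs_samples)
mc_mean = means.mean()
# Law of total variance: E[var(f*|x*)] + var(E[f*|x*]).
mc_var = vars_.mean() + means.var()
```

Note how `mc_var` exceeds the predictive variance at the fixed input x* = 2.5: input uncertainty inflates the output uncertainty, which is the effect the paper's variational framework propagates in a principled way during both training and (iterative) prediction.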

Cite this Paper

BibTeX
@InProceedings{Damianou:semi15,
  title = {Semi-described and Semi-supervised Learning with {G}aussian Processes},
  author = {Andreas Damianou and Neil D. Lawrence},
  booktitle = {31st Conference on Uncertainty in Artificial Intelligence (UAI)},
  year = {2015},
  abstract = {Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as 'semi-described learning'. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that allow for imputing the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.}
}
Endnote
%0 Conference Paper
%T Semi-described and Semi-supervised Learning with Gaussian Processes
%A Andreas Damianou
%A Neil D. Lawrence
%B 31st Conference on Uncertainty in Artificial Intelligence (UAI)
%D 2015
%F Damianou:semi15
%X Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as 'semi-described learning'. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that allow for imputing the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
RIS
TY  - CPAPER
TI  - Semi-described and Semi-supervised Learning with Gaussian Processes
AU  - Andreas Damianou
AU  - Neil D. Lawrence
BT  - 31st Conference on Uncertainty in Artificial Intelligence (UAI)
DA  - 2015/01/01
ID  - Damianou:semi15
AB  - Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as 'semi-described learning'. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that allow for imputing the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
ER  -
APA
Damianou, A., & Lawrence, N. D. (2015). Semi-described and semi-supervised learning with Gaussian processes. 31st Conference on Uncertainty in Artificial Intelligence (UAI).