Semi-described and Semi-supervised Learning with Gaussian Processes

31st Conference on Uncertainty in Artificial Intelligence (UAI), 2015.

Abstract

Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as ‘semi-described learning’. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problem (where missing values occur in the outputs). Auto-regressive state space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that allow for imputing the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
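The intractability mentioned above arises because the GP predictive distribution must be integrated over the input's distribution. The paper handles this variationally; as a rough illustration only (not the paper's method), the sketch below approximates the same integral by Monte Carlo: a toy GP regressor with a fixed RBF kernel is queried at an uncertain input x* ~ N(mu, s2), and the law of total variance combines the per-sample predictive variances with the spread of the predictive means. All names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, variance=1.0, lengthscale=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(X, y, Xstar, noise=1e-2):
    # Standard GP regression: posterior mean and variance at Xstar.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xstar)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xstar, Xstar)) - (v ** 2).sum(0)
    return mean, var

# Toy training data: y = sin(x) plus noise (illustrative).
rng = np.random.default_rng(0)
X = np.linspace(0.0, 5.0, 30)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)

# An uncertain ("semi-described") test input: x* ~ N(2.5, 0.3^2).
mu_x, std_x = 2.5, 0.3
samples = rng.normal(mu_x, std_x, size=(500, 1))
m, v = gp_posterior(X, y, samples)

# Law of total variance: the overall predictive variance exceeds the
# average conditional variance, because the input itself is uncertain.
pred_mean = m.mean()
pred_var = v.mean() + m.var()
```

In iterative forecasting the same issue compounds: each predicted (and therefore uncertain) output is fed back in as the next input, so the input uncertainty must be propagated at every step rather than collapsed to a point estimate.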
