# Semi-described and semi-supervised learning with Gaussian processes

Andreas Damianou, University of Sheffield
Neil D. Lawrence, University of Sheffield

In the *31st Conference on Uncertainty in Artificial Intelligence (UAI)*, 2015

#### Abstract

Propagating input uncertainty through non-linear Gaussian process (GP) mappings is intractable. This hinders the task of training GPs using uncertain and partially observed inputs. In this paper we refer to this task as “semi-described learning”. We then introduce a GP framework that solves both the semi-described and the semi-supervised learning problems (where missing values occur in the outputs). Auto-regressive state-space simulation is also recognised as a special case of semi-described learning. To achieve our goal we develop variational methods for handling semi-described inputs in GPs, and couple them with algorithms that impute the missing values while treating the uncertainty in a principled, Bayesian manner. Extensive experiments on simulated and real-world data study the problems of iterative forecasting and regression/classification with missing values. The results suggest that the principled propagation of uncertainty stemming from our framework can significantly improve performance in these tasks.
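The paper's variational treatment of uncertain inputs is not reproduced here, but the core idea — propagating a Gaussian-distributed input through a GP predictive mapping, as needed in iterative forecasting — can be sketched with a simple Monte Carlo approximation. The code below is an illustrative sketch, not the paper's method: the `GP`, `rbf`, and `mc_predict` names and all hyperparameter values are assumptions chosen for the example.

```python
import numpy as np

def rbf(X1, X2, var=1.0, ls=1.0):
    """Squared-exponential (RBF) kernel between two sets of points."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

class GP:
    """Minimal exact GP regression with fixed RBF hyperparameters."""
    def __init__(self, X, y, noise=1e-2):
        self.X = X
        K = rbf(X, X) + noise * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def predict(self, Xs):
        """Predictive mean and variance at deterministic test inputs Xs."""
        Ks = rbf(Xs, self.X)
        mu = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = rbf(Xs, Xs).diagonal() - (v**2).sum(0)
        return mu, np.maximum(var, 1e-12)

def mc_predict(gp, mu_x, var_x, n_samples=500, seed=0):
    """Propagate an uncertain input x ~ N(mu_x, var_x) through the GP.

    Samples from the input distribution, pushes each sample through the
    GP predictive, and combines via the law of total variance.
    """
    rng = np.random.default_rng(seed)
    xs = rng.normal(mu_x, np.sqrt(var_x), size=(n_samples, 1))
    mu_f, var_f = gp.predict(xs)
    mean = mu_f.mean()
    var = var_f.mean() + mu_f.var()  # E[var] + var[mean]
    return mean, var
```

In iterative (auto-regressive) forecasting, `mc_predict` would be applied repeatedly: each prediction's mean and variance become the uncertain input for the next step, so uncertainty accumulates over the horizon instead of being discarded. The paper replaces this sampling step with a deterministic variational approximation.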

    @InProceedings{damianou-semi15,
      title = {Semi-described and semi-supervised learning with {G}aussian processes},
      author = {Andreas Damianou and Neil D. Lawrence},
      booktitle = {31st Conference on Uncertainty in Artificial Intelligence (UAI)},
      year = {2015},
      month = {7},
      url = {http://inverseprobability.com/publications/damianou-semi15.html},
      key = {Damianou:semi15},
      linkpdf = {http://arxiv.org/pdf/1509.01168v1.pdf},
    }
Damianou, A. & Lawrence, N.D. (2015). Semi-described and semi-supervised learning with Gaussian processes. In *31st Conference on Uncertainty in Artificial Intelligence (UAI)*.