Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models “Chained Gaussian Processes”: the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.
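To make the contrast concrete, here is a minimal NumPy sketch (illustrative only, not the paper's inference code) of the heteroscedastic Gaussian case the abstract alludes to: a GLM-style model pushes a single latent process through one invertible link, while a chained model combines two latent processes non-linearly into the likelihood parameters. The fixed functions `f` and `g` below are stand-ins for GP draws.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Two independent latent functions (stand-ins for draws from GP priors).
f = np.sin(2.0 * np.pi * x)   # models the mean
g = -1.0 + 1.5 * x            # models the log standard deviation

# GLM-style link: one invertible function of a single latent process,
# e.g. a Poisson rate via the exponential inverse link.
rate = np.exp(f)

# Chained-GP style: the likelihood p(y | f, g) = N(y; f, exp(g)^2)
# combines two processes; the map (f, g) -> (mean, variance) is not
# a single invertible link applied to one linear combination.
y = f + np.exp(g) * rng.standard_normal(x.shape)
```

The point of the sketch is the likelihood line: each observation depends on two latent function values, which is what the "chain" of localized linearisations in the paper's approximate inference has to handle.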

@InProceedings{saul-chained16,
title = {Chained Gaussian Processes},
author = {Alan D. Saul and James Hensman and Aki Vehtari and Neil D. Lawrence},
booktitle = {Proceedings of the Nineteenth International Workshop on Artificial Intelligence and Statistics},
pages = {1431--1440},
year = {2016},
editor = {Arthur Gretton and Christian Robert},
volume = {51},
address = {Cadiz, Spain},
publisher = {JMLR W\&CP 51},
edit = {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2016-01-01-saul-chained16.md},
url = {http://inverseprobability.com/publications/saul-chained16.html},
abstract = {Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models “Chained Gaussian Processes”: the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.},
crossref = {Gretton:aistats16},
key = {Saul:chained16},
linkpdf = {http://jmlr.org/proceedings/papers/v51/saul16.pdf},
OPTgroup = {}
}

%T Chained Gaussian Processes
%A Alan D. Saul and James Hensman and Aki Vehtari and Neil D. Lawrence
%B Proceedings of the Nineteenth International Workshop on Artificial Intelligence and Statistics
%C Cadiz, Spain
%D 2016
%E Arthur Gretton and Christian Robert
%F saul-chained16
%I JMLR W&CP 51
%P 1431--1440
%U http://inverseprobability.com/publications/saul-chained16.html
%V 51
%X Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models “Chained Gaussian Processes”: the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.

TY - CPAPER
TI - Chained Gaussian Processes
AU - Alan D. Saul
AU - James Hensman
AU - Aki Vehtari
AU - Neil D. Lawrence
BT - Proceedings of the Nineteenth International Workshop on Artificial Intelligence and Statistics
PY - 2016/01/01
DA - 2016/01/01
ED - Arthur Gretton
ED - Christian Robert
ID - saul-chained16
PB - JMLR W&CP 51
SP - 1431
EP - 1440
L1 - http://jmlr.org/proceedings/papers/v51/saul16.pdf
UR - http://inverseprobability.com/publications/saul-chained16.html
AB - Gaussian process models are flexible, Bayesian non-parametric approaches to regression. Properties of multivariate Gaussians mean that they can be combined linearly in the manner of additive models and via a link function (as in generalized linear models) to handle non-Gaussian data. However, the link function formalism is restrictive: link functions are always invertible and must convert a parameter of interest to a linear combination of the underlying processes. There are many likelihoods and models where a non-linear combination is more appropriate. We term these more general models “Chained Gaussian Processes”: the transformation of the GPs to the likelihood parameters will not generally be invertible, and that implies that linearisation would only be possible with multiple (localized) links, i.e. a chain. We develop an approximate inference procedure for Chained GPs that is scalable and applicable to any factorized likelihood. We demonstrate the approximation on a range of likelihood functions.
ER -

Saul, A.D., Hensman, J., Vehtari, A. & Lawrence, N.D. (2016). Chained Gaussian Processes. Proceedings of the Nineteenth International Workshop on Artificial Intelligence and Statistics 51:1431-1440.