Variationally Auto-Encoded Deep Gaussian Processes

Zhenwen Dai, Andreas Damianou, Javier Gonzalez, Neil D. Lawrence
International Conference on Learning Representations (ICLR), 2016.

Abstract

We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
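To make the amortization idea concrete, here is a minimal sketch of the recognition-model trick the abstract describes: rather than keeping a free variational mean and variance for every data point, a multilayer perceptron maps each observation to its variational parameters, so the number of free parameters stays fixed as the sample size grows. This is an illustrative sketch only, not the authors' implementation; all names, shapes, and the single-hidden-layer architecture are assumptions.

import numpy as np

def recognition_mlp(Y, W1, b1, W2, b2):
    # Hypothetical recognition model: maps observations Y (N x D) to the
    # means and variances of per-point variational posteriors over a
    # Q-dimensional latent layer. All variational parameters come from the
    # shared weights (W1, b1, W2, b2), so their count is constant in N
    # instead of growing linearly with the sample size.
    H = np.tanh(Y @ W1 + b1)              # hidden layer, N x H_dim
    out = H @ W2 + b2                     # N x 2Q: [means | log-variances]
    Q = out.shape[1] // 2
    mu, log_var = out[:, :Q], out[:, Q:]
    return mu, np.exp(log_var)            # exp keeps variances positive

# Usage: 1000 observations, but only (D*H_dim + H_dim) + (H_dim*2Q + 2Q)
# free parameters, independent of N.
rng = np.random.default_rng(0)
N, D, H_dim, Q = 1000, 5, 16, 2
Y = rng.normal(size=(N, D))
W1 = 0.1 * rng.normal(size=(D, H_dim)); b1 = np.zeros(H_dim)
W2 = 0.1 * rng.normal(size=(H_dim, 2 * Q)); b2 = np.zeros(2 * Q)
mu, var = recognition_mlp(Y, W1, b1, W2, b2)   # each is N x Q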

Cite this Paper


BibTeX
@InProceedings{dai-variationally16,
  title = {Variationally Auto-Encoded Deep Gaussian Processes},
  author = {Zhenwen Dai and Andreas Damianou and Javier Gonzalez and Neil D. Lawrence},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year = {2016},
  address = {Caribe Hotel, San Juan, PR},
  url = {http://inverseprobability.com/publications/dai-variationally16.html},
  abstract = {We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.}
}
Endnote
%0 Conference Paper
%T Variationally Auto-Encoded Deep Gaussian Processes
%A Zhenwen Dai
%A Andreas Damianou
%A Javier Gonzalez
%A Neil D. Lawrence
%B International Conference on Learning Representations (ICLR)
%C Caribe Hotel, San Juan, PR
%D 2016
%F dai-variationally16
%U http://inverseprobability.com/publications/dai-variationally16.html
%X We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
RIS
TY  - CPAPER
TI  - Variationally Auto-Encoded Deep Gaussian Processes
AU  - Zhenwen Dai
AU  - Andreas Damianou
AU  - Javier Gonzalez
AU  - Neil D. Lawrence
BT  - International Conference on Learning Representations (ICLR)
PY  - 2016
ID  - dai-variationally16
UR  - http://inverseprobability.com/publications/dai-variationally16.html
AB  - We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
ER  -
APA
Dai, Z., Damianou, A., Gonzalez, J., & Lawrence, N. D. (2016). Variationally auto-encoded deep Gaussian processes. In International Conference on Learning Representations (ICLR).