Variationally Auto-Encoded Deep Gaussian Processes

Proceedings of the International Conference on Learning Representations, 3, 2016.

Abstract

We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges including deep unsupervised learning and deep Bayesian optimization.
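
The core idea of the recognition model is amortized inference: instead of storing separate variational parameters for every data point, a shared multilayer perceptron maps each observation to the mean and variance of its variational posterior, so the parameter count is fixed by the network size. The sketch below is a minimal, hypothetical illustration of this pattern in PyTorch, not the authors' implementation; the class name `RecognitionModel`, the layer sizes, and the Gaussian posterior form are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class RecognitionModel(nn.Module):
    """Illustrative MLP mapping each observation y_n to the parameters of
    its variational posterior q(x_n) = N(mu_n, diag(sigma_n^2)).

    Because the mapping is shared across all data points, the number of
    variational parameters is set by the network size rather than growing
    linearly with the sample size. (A sketch of the general technique,
    not the paper's exact architecture.)"""

    def __init__(self, y_dim, x_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(y_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, 2 * x_dim),  # outputs mean and log-variance
        )
        self.x_dim = x_dim

    def forward(self, y):
        stats = self.net(y)
        mu, log_var = stats[:, :self.x_dim], stats[:, self.x_dim:]
        # Reparametrization trick: draw x ~ q(x | y) as a differentiable
        # transform of standard Gaussian noise.
        eps = torch.randn_like(mu)
        x = mu + torch.exp(0.5 * log_var) * eps
        return x, mu, log_var

# Usage: amortize the posterior over a minibatch; no per-point parameters
# are created, so minibatch terms of the bound can be computed in parallel.
model = RecognitionModel(y_dim=10, x_dim=2)
y_batch = torch.randn(32, 10)
x_sample, mu, log_var = model(y_batch)
```

Because the variational parameters are produced on the fly from each minibatch, the terms of the lower bound decompose across data points, which is what allows the computation to be distributed in the manner the abstract describes.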
