Nested Variational Compression in Deep Gaussian Processes

James Hensman, Neil D. Lawrence, 2014.

Abstract

Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either supervised or unsupervised learning. For tractable inference, approximations to the marginal likelihood of the model must be made. The original approach to approximate inference in these models used variational compression to allow for approximate variational marginalization of the hidden variables, leading to a lower bound on the marginal likelihood of the model [Damianou and Lawrence, 2013]. In this paper we extend this idea with a nested variational compression. The resulting lower bound on the likelihood can be easily parallelized or adapted for stochastic variational inference.
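For a single sparse GP layer, variational compression (following Titsias, 2009) yields a bound of the form

$$\log p(\mathbf{y}) \geq \log \mathcal{N}\!\left(\mathbf{y} \mid \mathbf{0},\; \mathbf{K}_{fu}\mathbf{K}_{uu}^{-1}\mathbf{K}_{uf} + \sigma^2\mathbf{I}\right) - \frac{1}{2\sigma^2}\,\mathrm{tr}\!\left(\mathbf{K}_{ff} - \mathbf{K}_{fu}\mathbf{K}_{uu}^{-1}\mathbf{K}_{uf}\right),$$

where $\mathbf{u}$ is a set of $m$ inducing variables, $\mathbf{K}_{uu}$ is their covariance and $\mathbf{K}_{fu}$ is the cross-covariance with the latent function values; the nested construction of the paper applies this compression recursively, layer by layer. The following numpy sketch computes this single-layer bound; the function names and the RBF kernel are illustrative assumptions, not the paper's code.

import numpy as np

def rbf(X, Z, variance=1.0, lengthscale=1.0):
    # Squared-exponential covariance between the rows of X and Z.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def titsias_bound(X, y, Z, noise_var=0.1):
    # Collapsed variational lower bound on log p(y) for a sparse GP
    # with inducing inputs Z (Titsias, 2009). Illustrative sketch only.
    n, m = X.shape[0], Z.shape[0]
    Kuu = rbf(Z, Z) + 1e-8 * np.eye(m)          # jitter for stability
    Kfu = rbf(X, Z)
    Qff = Kfu @ np.linalg.solve(Kuu, Kfu.T)     # Nystroem approximation of Kff
    cov = Qff + noise_var * np.eye(n)
    # Gaussian log-density log N(y | 0, Qff + sigma^2 I)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)
    # Trace correction penalising the compression error Kff - Qff
    trace_term = (np.trace(rbf(X, X)) - np.trace(Qff)) / (2.0 * noise_var)
    return log_gauss - trace_term

For inputs X of shape (n, d), targets y of shape (n,) and inducing inputs Z of shape (m, d), titsias_bound(X, y, Z) returns a scalar lower bound on the log marginal likelihood. Keeping an explicit variational distribution over the inducing variables, rather than collapsing it as above (Hensman et al., 2013), turns the bound into a sum over data points, which is what makes the parallel and stochastic variants mentioned in the abstract straightforward.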

Cite this Paper


BibTeX
@Misc{Hensman-nested14,
  title    = {Nested Variational Compression in Deep {G}aussian Processes},
  author   = {Hensman, James and Lawrence, Neil D.},
  year     = {2014},
  pdf      = {https://arxiv.org/pdf/1412.1370.pdf},
  url      = {http://inverseprobability.com/publications/hensman-nested14.html},
  abstract = {Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either supervised or unsupervised learning. For tractable inference, approximations to the marginal likelihood of the model must be made. The original approach to approximate inference in these models used variational compression to allow for approximate variational marginalization of the hidden variables, leading to a lower bound on the marginal likelihood of the model [Damianou and Lawrence, 2013]. In this paper we extend this idea with a nested variational compression. The resulting lower bound on the likelihood can be easily parallelized or adapted for stochastic variational inference.}
}
APA
Hensman, J., & Lawrence, N. D. (2014). Nested Variational Compression in Deep Gaussian Processes. Available from http://inverseprobability.com/publications/hensman-nested14.html
