Bayesian learning via neural Schrödinger–Föllmer flows

Francisco Vargas, Andrius Ovsianas, David Fernandes, Mark Girolami, Neil D. Lawrence, Nikolas Nüsken
Statistics and Computing, 33(3), 2022.

Abstract

In this work we explore a new framework for approximate Bayesian inference in large datasets based on stochastic control. We advocate stochastic control as a finite time and low variance alternative to popular steady-state methods such as stochastic gradient Langevin dynamics. Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to already existing VI routines in SDE-based models.
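To make the stochastic-control idea concrete, here is a minimal sketch (not the authors' implementation) of a Schrödinger–Föllmer-style sampler: a neural drift u_θ(x, t) drives an Euler–Maruyama-discretised SDE from a Dirac mass at zero over t ∈ [0, 1], and is trained by minimising a Girsanov-type control objective, i.e. a running cost 0.5‖u‖² plus a terminal term comparing the reference Gaussian with an unnormalised log-target. PyTorch, the names DriftNet, control_loss and log_target, and all hyperparameters are illustrative assumptions, not taken from the paper.

# Illustrative sketch of a Schrödinger–Föllmer-style sampler trained by
# stochastic control; all names and hyperparameters are placeholders.
import torch
import torch.nn as nn

class DriftNet(nn.Module):
    """Neural drift u_theta(x, t) for the controlled SDE on t in [0, 1]."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        # broadcast the scalar time over the batch and concatenate with the state
        t_col = t.expand(x.shape[0], 1)
        return self.net(torch.cat([x, t_col], dim=-1))

def control_loss(drift, log_target, dim, batch=256, steps=50):
    """Control objective: running cost 0.5*||u||^2 plus a terminal term
    log N(x_1; 0, I) - log pi(x_1), up to an additive constant."""
    dt = 1.0 / steps
    x = torch.zeros(batch, dim)          # process starts at a Dirac at zero
    running = torch.zeros(batch)
    for k in range(steps):
        t = torch.full((1,), k * dt)
        u = drift(x, t)
        running = running + 0.5 * (u ** 2).sum(-1) * dt
        x = x + u * dt + torch.randn_like(x) * dt ** 0.5   # Euler–Maruyama step
    log_ref = -0.5 * (x ** 2).sum(-1) - 0.5 * dim * torch.log(torch.tensor(2 * torch.pi))
    return (running + log_ref - log_target(x)).mean()

# usage: fit the drift so that X_1 approximately follows a toy Gaussian target
dim = 2
drift = DriftNet(dim)
opt = torch.optim.Adam(drift.parameters(), lr=1e-3)
log_target = lambda x: -0.5 * ((x - 2.0) ** 2).sum(-1)   # unnormalised log-density
for _ in range(200):
    opt.zero_grad()
    loss = control_loss(drift, log_target, dim)
    loss.backward()
    opt.step()

Because the horizon is fixed at t = 1, samples are drawn after a finite number of SDE steps rather than by waiting for a Markov chain to reach its stationary distribution, which is the contrast with steady-state methods such as SGLD that the abstract draws.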

Cite this Paper


BibTeX
@Article{bayesian-learning-via-schroedinger-follmer-flows,
  title = {Bayesian learning via neural Schrödinger–Föllmer flows},
  author = {Vargas, Francisco and Ovsianas, Andrius and Fernandes, David and Girolami, Mark and Lawrence, Neil D. and Nüsken, Nikolas},
  journal = {Statistics and Computing},
  year = {2022},
  volume = {33},
  number = {3},
  doi = {10.1007/s11222-022-10172-5},
  pdf = {https://trebuchet.public.springernature.app/get_content/f9636384-d26a-4b40-bd96-3adb08e2c867},
  url = {http://inverseprobability.com/publications/bayesian-learning-via-schroedinger-follmer-flows.html},
  abstract = {In this work we explore a new framework for approximate Bayesian inference in large datasets based on stochastic control. We advocate stochastic control as a finite time and low variance alternative to popular steady-state methods such as stochastic gradient Langevin dynamics. Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to already existing VI routines in SDE-based models.}
}
Endnote
%0 Journal Article
%T Bayesian learning via neural Schrödinger–Föllmer flows
%A Francisco Vargas
%A Andrius Ovsianas
%A David Fernandes
%A Mark Girolami
%A Neil D. Lawrence
%A Nikolas Nüsken
%B Statistics and Computing
%D 2022
%F bayesian-learning-via-schroedinger-follmer-flows
%R 10.1007/s11222-022-10172-5
%U http://inverseprobability.com/publications/bayesian-learning-via-schroedinger-follmer-flows.html
%V 33
%N 3
%X In this work we explore a new framework for approximate Bayesian inference in large datasets based on stochastic control. We advocate stochastic control as a finite time and low variance alternative to popular steady-state methods such as stochastic gradient Langevin dynamics. Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to already existing VI routines in SDE-based models.
RIS
TY - JOUR
TI - Bayesian learning via neural Schrödinger–Föllmer flows
AU - Francisco Vargas
AU - Andrius Ovsianas
AU - David Fernandes
AU - Mark Girolami
AU - Neil D. Lawrence
AU - Nikolas Nüsken
BT - Statistics and Computing
DA - 2022/11/23
ID - bayesian-learning-via-schroedinger-follmer-flows
VL - 33
IS - 3
DO - 10.1007/s11222-022-10172-5
L1 - https://trebuchet.public.springernature.app/get_content/f9636384-d26a-4b40-bd96-3adb08e2c867
UR - http://inverseprobability.com/publications/bayesian-learning-via-schroedinger-follmer-flows.html
AB - In this work we explore a new framework for approximate Bayesian inference in large datasets based on stochastic control. We advocate stochastic control as a finite time and low variance alternative to popular steady-state methods such as stochastic gradient Langevin dynamics. Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to already existing VI routines in SDE-based models.
ER -
APA
Vargas, F., Ovsianas, A., Fernandes, D., Girolami, M., Lawrence, N. D. & Nüsken, N. (2022). Bayesian learning via neural Schrödinger–Föllmer flows. Statistics and Computing, 33(3). doi:10.1007/s11222-022-10172-5 Available from http://inverseprobability.com/publications/bayesian-learning-via-schroedinger-follmer-flows.html.
