Machine Learning Reading Group @ CUED, University of Cambridge

Schrödinger bridges, diffusion and SDEs


If you have a question about this talk, please contact Isaac Reid.

Zoom link available upon request; it is sent out on our mailing list, eng-mlg-rcc [at]. Sign up to our mailing list for easier reminders.

In this talk, we cover the mathematical groundwork for stochastic differential equations (SDEs), introducing Itô and Stratonovich calculus. We then discuss how SDEs serve as a natural framework for unifying many generative modelling techniques, such as score matching, denoising diffusion models, and conditional flows. We present some theoretical convergence results for SDEs, and show how the Schrödinger bridge formulation improves convergence by imposing boundary conditions on the diffusion. Finally, we discuss some empirical techniques for solving Schrödinger bridge problems in generative modelling.
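As a concrete illustration (not taken from the talk itself), the forward noising process in score-based generative modelling [1] can be written as a variance-preserving SDE, dx = -(1/2) β(t) x dt + √β(t) dW, which gradually transports data samples toward a standard Gaussian. The sketch below simulates this SDE with the Euler–Maruyama scheme in NumPy; the linear β schedule and its endpoints are assumptions chosen for demonstration, not values mandated by the references.

```python
import numpy as np

def euler_maruyama_vp_sde(x0, beta_min=0.1, beta_max=20.0, n_steps=1000, seed=0):
    """Simulate the forward variance-preserving SDE
        dx = -0.5 * beta(t) * x dt + sqrt(beta(t)) dW,  t in [0, 1],
    with Euler-Maruyama steps, driving samples toward N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        beta_t = beta_min + t * (beta_max - beta_min)  # assumed linear schedule
        drift = -0.5 * beta_t * x
        diffusion = np.sqrt(beta_t)
        # Euler-Maruyama update: deterministic drift plus Gaussian increment.
        x = x + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Diffuse a far-from-Gaussian "dataset" (all points at 5.0) into noise.
x0 = np.full(10_000, 5.0)
xT = euler_maruyama_vp_sde(x0)
print(xT.mean(), xT.std())  # close to 0 and 1
```

Generative modelling then runs this process in reverse, replacing the unknown score with a learned approximation; the Schrödinger bridge view of [2] instead constrains both endpoints of the diffusion.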

References:
[1] Song, Yang, et al. "Score-based generative modeling through stochastic differential equations." arXiv preprint arXiv:2011.13456 (2020).
[2] De Bortoli, Valentin, et al. "Diffusion Schrödinger bridge with applications to score-based generative modeling." Advances in Neural Information Processing Systems 34 (2021): 17695-17709.

This talk is part of the Machine Learning Reading Group @ CUED series.

