
Flow matching, stochastic interpolants and everything in between


If you have a question about this talk, please contact Isaac Reid.

Zoom link available upon request (it is sent out on our mailing list, eng-mlg-rcc [at] lists.cam.ac.uk). Sign up to our mailing list for easier reminders via lists.cam.ac.uk.

Flow matching is a recent development in deep generative modelling and has already been applied to numerous tasks, including protein design and image inpainting. It brought continuous normalising flows back to the fore by showing that they can be trained 'simulation-free', i.e. without solving an ODE at each training step. Flow matching models are closely related to diffusion models, yet produce noiseless trajectories at sampling time. We will introduce the core ideas and discuss the advantages and drawbacks of noisy trajectories.
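
To make the 'simulation-free' point concrete, below is a minimal sketch of a conditional flow matching training objective with a linear (optimal-transport-style) interpolant, in the spirit of the suggested reading. It assumes PyTorch; the network `velocity_net` and the `sigma_min` parameter are illustrative placeholders, not code from any of the cited papers.

    # Minimal conditional flow matching sketch (assumption: PyTorch available).
    # Uses a linear interpolant x_t = (1 - (1 - sigma_min) t) x_0 + t x_1.
    import torch

    def cfm_loss(velocity_net, x1, sigma_min=1e-4):
        """Conditional flow matching loss for one minibatch of data x1."""
        x0 = torch.randn_like(x1)                     # sample from the Gaussian base distribution
        t = torch.rand(x1.shape[0], 1)                # uniform time in [0, 1]
        xt = (1 - (1 - sigma_min) * t) * x0 + t * x1  # point on the conditional path
        target = x1 - (1 - sigma_min) * x0            # conditional velocity d/dt x_t
        pred = velocity_net(xt, t)                    # network regresses the velocity field
        return ((pred - target) ** 2).mean()

Training only requires this regression loss, with no ODE solve per step; sampling afterwards integrates dx/dt = velocity_net(x, t) from t = 0 (noise) to t = 1 (data) with any off-the-shelf ODE solver, which is what yields the noiseless trajectories mentioned above.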

Suggested reading:
1. Lipman, Chen & Ben-Hamu et al. (2022). Flow Matching for Generative Modeling. http://arxiv.org/abs/2210.02747
2. Tong, Malkin & Huguet et al. (2023). Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport. http://arxiv.org/abs/2302.00482
3. Albergo, Boffi & Vanden-Eijnden (2023). Stochastic Interpolants: A Unifying Framework for Flows and Diffusions. http://arxiv.org/abs/2303.08797

This talk is part of the Machine Learning Reading Group @ CUED series.
