
Diffusion and Score-based Generative Models


If you have a question about this talk, please contact Elre Oldewage.


Score-based generative models have recently shown impressive results in generating synthetic data from complex distributions, becoming a promising alternative to GANs for sampling photorealistic images. The key idea in these models is to reverse a Markov chain that gradually perturbs data into white noise: running the chain backwards, a white-noise sample is gradually denoised into a sample from the target density. During training, the score functions (gradients of log probability density functions) of a large number of noise-perturbed data distributions are learned; hence the name of the model. In this reading group, we cover the basics and advantages of score-based generative models, their connections to SDEs, and the link with diffusion-based models.
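The sampling idea in the abstract can be illustrated with a toy sketch (not material from the talk): annealed Langevin dynamics on a 1-D Gaussian mixture, where the *analytic* score of the noise-perturbed density stands in for a learned score network. The mixture means, noise levels, and step-size schedule below are illustrative choices; convolving N(mu, s^2) with N(0, sigma^2) noise gives N(mu, s^2 + sigma^2), so the perturbed score is available in closed form here.

```python
import numpy as np

MEANS = np.array([-4.0, 4.0])  # mixture component means (illustrative)
STD = 1.0                      # shared component standard deviation

def perturbed_score(x, sigma):
    """Score d/dx log p_sigma(x) of the mixture convolved with N(0, sigma^2)."""
    var = STD**2 + sigma**2
    # posterior responsibilities of each (equal-weight) component
    logits = -(x[:, None] - MEANS) ** 2 / (2 * var)
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return (w * (MEANS - x[:, None]) / var).sum(axis=1)

def annealed_langevin(n=2000, sigmas=(3.0, 1.0, 0.3, 0.1),
                      steps=100, eps=0.002, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigmas[0], size=n)       # start from broad noise
    for sigma in sigmas:                         # anneal noise level downward
        step = eps * (sigma / sigmas[-1]) ** 2   # larger steps at high noise
        for _ in range(steps):
            x = (x + 0.5 * step * perturbed_score(x, sigma)
                 + np.sqrt(step) * rng.normal(size=n))
    return x

samples = annealed_langevin()
```

Starting from near-white noise, the samples are progressively denoised toward the two modes at roughly -4 and +4, mirroring the reverse-chain picture above; with a learned score model, `perturbed_score` would be replaced by the network's output at the corresponding noise level.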

Recommended reading

- Deep Unsupervised Learning using Nonequilibrium Thermodynamics. Jascha Sohl-Dickstein, Eric A. Weiss, Niru Maheswaranathan, Surya Ganguli. ICML 2015. [Required]

- Generative Modeling by Estimating Gradients of the Data Distribution. Yang Song, Stefano Ermon. NeurIPS 2019. [Required]

- Denoising Diffusion Probabilistic Models. Jonathan Ho, Ajay Jain, Pieter Abbeel. NeurIPS 2020. [Optional]

- Score-Based Generative Modeling through Stochastic Differential Equations. Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole. ICLR 2021. [Optional]

Our reading groups are livestreamed via Zoom and recorded for our YouTube channel. The Zoom details are distributed via our weekly mailing list.

This talk is part of the Machine Learning Reading Group @ CUED series.



© 2006-2024, University of Cambridge.