
An Overview of Normalizing Flows


If you have a question about this talk, please contact Robert Pinsler.

Normalizing flows have become extremely popular recently because they combine high model capacity with exact density evaluation. This has made flows a state-of-the-art tool for variational inference and generative modeling, with DeepMind’s Parallel WaveNet being one of the most notable success stories. In turn, the research literature has been inundated with flow variants, making it hard to extract trends in this subfield. In this meeting of the reading group, we attempt to organize the recent developments in normalizing flows. Beginning with discrete flows, we describe how early models such as NICE have been gradually endowed with more capacity, resulting in state-of-the-art generators such as OpenAI’s Glow. We also summarize the various masking procedures essential to defining autoregressive flows, as used in models such as Parallel WaveNet. Lastly, we highlight the burgeoning field of continuous-time flows and the numerical methods used for their training and evaluation.
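To make the "exact density evaluation" point concrete, here is a minimal sketch (not from the talk itself) of a NICE-style additive coupling layer in NumPy. The `shift_fn` name is a placeholder for any learned conditioner network; in NICE the Jacobian of an additive coupling is unit-triangular, so the log-determinant in the change-of-variables formula is exactly zero and the density under a standard-normal base is available in closed form.

```python
import numpy as np

def coupling_forward(x, shift_fn):
    # Split the input into two halves; transform the second half
    # conditioned on the first (additive coupling, as in NICE).
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    y2 = x2 + shift_fn(x1)  # Jacobian is unit-triangular -> log|det| = 0
    return np.concatenate([x1, y2], axis=-1)

def coupling_inverse(y, shift_fn):
    # The inverse is exact and cheap: subtract the same shift.
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    x2 = y2 - shift_fn(y1)
    return np.concatenate([y1, x2], axis=-1)

def log_density(x, shift_fn):
    # Change of variables: log p(x) = log N(f(x); 0, I) + log|det J_f(x)|,
    # and for additive coupling the log-det term vanishes.
    z = coupling_forward(x, shift_fn)
    return -0.5 * np.sum(z**2, axis=-1) - 0.5 * z.shape[-1] * np.log(2 * np.pi)
```

In practice several such layers are stacked, with the roles of the two halves alternating so every dimension is eventually transformed; Glow-style models additionally use invertible 1x1 convolutions and affine (rather than purely additive) couplings.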

This talk is part of the Machine Learning Reading Group @ CUED series.


© 2006-2019 Talks.cam, University of Cambridge.