Opening up the black box of score estimation

  • Sitan Chen (Harvard University)
  • Friday 19 July 2024, 11:00-12:00
  • External

DMLW01 - International workshop on diffusions in machine learning: foundations, generative models, and optimisation

In recent years there has been significant interest in the theoretical foundations of diffusion generative modeling. One representative result in this line of work is that with an accurate estimate of the score function for the data distribution, one can approximately sample from virtually any bounded distribution in polynomial time. In this talk I will describe recent work on the missing piece left open by these works: when can we actually learn an accurate estimate of the score from data? I will focus on two vignettes: (1) learning Gaussian mixture models (GMMs), and (2) learning optimal estimators for compressed sensing.

For (1), I will present an algorithm for score estimation based on piecewise polynomial regression, yielding the first quasipolynomial-time algorithm for learning general mixtures of Gaussians with polylogarithmically many components. For (2), I will give the first rigorous learning guarantee for algorithm unrolling, proving that a certain unrolled network, when trained on compressed sensing examples, learns to compete with Bayes approximate message passing.

Based on joint works with Aayush Karan, Vasilis Kontonis, and Kulin Shah.
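For readers unfamiliar with the terminology, the sketch below is a minimal, hedged illustration (not the algorithms presented in the talk) of what the score function of an isotropic Gaussian mixture looks like in closed form and how a Langevin-type sampler uses a score estimate. All function names, mixture parameters, and step sizes here are illustrative assumptions.

    # Illustrative sketch only: exact score of an isotropic Gaussian mixture,
    # plus a basic unadjusted Langevin sampler driven by that score.
    import numpy as np

    def gmm_score(x, weights, means, sigma2):
        """Score (gradient of log density) of sum_k w_k N(x; mu_k, sigma2 I)."""
        # Component log-densities up to a constant shared by all components.
        logits = np.log(weights) - np.sum((x - means) ** 2, axis=1) / (2 * sigma2)
        resp = np.exp(logits - logits.max())
        resp /= resp.sum()                      # posterior responsibilities r_k(x)
        # Score = sum_k r_k(x) * (mu_k - x) / sigma2
        return (resp[:, None] * (means - x)).sum(axis=0) / sigma2

    def langevin_sample(score_fn, x0, step=1e-2, n_steps=1000, rng=None):
        """Unadjusted Langevin dynamics using a (possibly estimated) score."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            x = x + step * score_fn(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        return x

    # Example with two well-separated components in 2D (arbitrary choices).
    weights = np.array([0.5, 0.5])
    means = np.array([[-3.0, 0.0], [3.0, 0.0]])
    score = lambda x: gmm_score(x, weights, means, sigma2=1.0)
    print(langevin_sample(score, x0=np.zeros(2)))

In this toy setting the score is known exactly; the question raised in the abstract is when such a score can instead be learned accurately from samples.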

This talk is part of the Isaac Newton Institute Seminar Series.
