Topics in Expectation Propagation

If you have a question about this talk, please contact Yingzhen Li.

Approximate inference is key to modern probabilistic modeling, since exact inference and learning are intractable for many models used in real-world applications. In this talk we present expectation propagation (EP) as a general framework for fast and accurate approximate inference. We survey a wide range of applications of EP to both posterior approximation and marginal inference. We also illustrate the flexibility of algorithm design within the EP framework by introducing factor graphs, approximating distribution families and projection operators. Finally, we provide a justification of EP from a variational viewpoint and connect it to the Bethe free energy approximation.
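
For readers unfamiliar with EP, the following is a brief sketch of the standard update step (general background on the algorithm, not material taken from the talk itself). Given a target distribution p(\theta) \propto p_0(\theta) \prod_n f_n(\theta), EP maintains an approximation q(\theta) \propto p_0(\theta) \prod_n \tilde{f}_n(\theta), with each approximate factor \tilde{f}_n chosen from a tractable family \mathcal{Q}, and refines one factor at a time:

q_{\setminus n}(\theta) \propto q(\theta) / \tilde{f}_n(\theta)  (form the cavity by removing factor n)
\hat{p}_n(\theta) \propto q_{\setminus n}(\theta)\, f_n(\theta)  (tilt the cavity with the true factor)
q^{\mathrm{new}}(\theta) = \arg\min_{q' \in \mathcal{Q}} \mathrm{KL}\big(\hat{p}_n \,\|\, q'\big)  (project back, i.e. moment matching)
\tilde{f}_n^{\mathrm{new}}(\theta) \propto q^{\mathrm{new}}(\theta) / q_{\setminus n}(\theta)  (update the approximate factor)

The projection step corresponds to the projection operators mentioned in the abstract; much of the design flexibility comes from the choice of the family \mathcal{Q} and of how the factors f_n are grouped in the factor graph.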

This talk is part of the Machine Learning Reading Group @ CUED series.
