
Stochastic discrete integration


If you have a question about this talk, please contact Yingzhen Li.

We will focus most of our effort on understanding the recently proposed WISH algorithm of Ermon, Gomes, Sabharwal and Selman for approximating partition functions, and then explain recent extensions of the original work as well as connections to coding theory. WISH is a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. It estimates a partition function by piecing together MAP solutions to a small number of discrete combinatorial optimization problems, each subject to randomly generated parity constraints, in effect trading a summation problem for a sequence of constrained optimization problems.
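The idea above can be sketched in a few lines of Python. This is a toy illustration only, not the authors' implementation: for each number of parity constraints i = 0..n it takes the median over a few trials of the constrained MAP value, then pieces the medians together geometrically. The brute-force enumeration stands in for the combinatorial MAP solver WISH would actually call, so it only works at toy scale; the independent-variable test model and all function names are our own.

```python
import numpy as np

def wish_estimate(log_w, n, T=7, rng=None):
    """Toy sketch of the WISH estimator (Ermon et al., ICML 2013).

    log_w: function from a {0,1}^n assignment (numpy array) to its log weight.
    Returns an estimate of Z = sum_x w(x). For each i = 0..n, take the median
    over T trials of the MAP value subject to i random parity (XOR)
    constraints, then combine the medians geometrically. The MAP step is
    brute force here (real WISH calls a combinatorial optimizer).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Enumerate all 2^n assignments (toy scale only).
    X = np.array([[(k >> j) & 1 for j in range(n)] for k in range(2 ** n)])
    logw = np.array([log_w(x) for x in X])

    medians = []
    for i in range(n + 1):
        maxima = []
        for _ in range(T):
            # Random parity constraints A x = b (mod 2) hash the state
            # space into 2^i buckets of roughly equal total weight.
            A = rng.integers(0, 2, size=(i, n))
            b = rng.integers(0, 2, size=i)
            ok = np.all((X @ A.T) % 2 == b, axis=1)
            maxima.append(logw[ok].max() if ok.any() else -np.inf)
        medians.append(np.median(maxima))

    # Z_hat = M_0 + sum_{i=1}^n 2^(i-1) * M_i, with M_i in linear scale.
    m = np.exp(np.array(medians))  # exp(-inf) -> 0 for empty buckets
    coeff = np.array([1.0] + [2.0 ** (i - 1) for i in range(1, n + 1)])
    return float(np.sum(coeff * m))

# Usage on a model with independent variables, where the exact partition
# function factorizes: Z = prod_j (1 + exp(theta_j)).
rng = np.random.default_rng(1)
n = 8
theta = rng.normal(size=n)
est = wish_estimate(lambda x: float(theta @ x), n, rng=rng)
exact = float(np.prod(1.0 + np.exp(theta)))
```

The usage example checks the estimate against the exact factorized partition function; the theory says the two should agree up to a constant factor with high probability.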

The talk will assume no prior background, but familiarity with the basics of graphical models (e.g. what an Ising model is) and with basic tools such as concentration inequalities (Markov's inequality) and union bounds will be useful; some of these will be reviewed as necessary.

Relevant Papers (you don’t need to read before coming):

S. Ermon, C. Gomes, A. Sabharwal, B. Selman, Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization. ICML 2013.

D. Achlioptas, P. Jiang, Stochastic Integration via Error-Correcting Codes. UAI 2015.

This talk is part of the Machine Learning Reading Group @ CUED series.



 
