Stein Points: Efficient sampling from posterior distributions by minimising Stein discrepancies
Programme: UNQ - Uncertainty quantification for complex systems: theory and methodologies

Abstract: An important task in computational statistics and machine learning is to approximate a posterior distribution with an empirical measure supported on a set of representative points. This work focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when the number of points is small. To this end, we present 'Stein Points'. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and the target measure. Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.

This talk is part of the Isaac Newton Institute Seminar Series.
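The greedy construction the abstract alludes to can be sketched as follows: each new point is chosen to minimise the kernel Stein discrepancy of the enlarged point set, which reduces to minimising a simple criterion built from a Stein-modified kernel. This is a minimal one-dimensional illustration, assuming a standard Gaussian target, an inverse multiquadric base kernel, and a fixed candidate grid; the function names and parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np

def stein_kernel(x, y, score):
    """Stein (Langevin) kernel k0 built from the inverse multiquadric
    base kernel k(x, y) = (1 + (x - y)^2)^(-1/2), in one dimension.
    `score` is the score function s(x) = d/dx log p(x) of the target;
    only the score is needed, so the target can be unnormalised."""
    beta = 0.5
    d = x - y
    base = 1.0 + d ** 2
    k = base ** (-beta)
    dkdx = -2.0 * beta * d * base ** (-beta - 1)   # dk/dx
    dkdy = -dkdx                                    # dk/dy = -dk/dx
    dkdxdy = (2.0 * beta * base ** (-beta - 1)
              - 4.0 * beta * (beta + 1) * d ** 2 * base ** (-beta - 2))
    sx, sy = score(x), score(y)
    return dkdxdy + sx * dkdy + sy * dkdx + sx * sy * k

def greedy_stein_points(n, candidates, score):
    """Greedy minimisation of the kernel Stein discrepancy: each new
    point minimises k0(x, x)/2 + sum_j k0(x_j, x) over the candidates."""
    points = []
    running = np.zeros_like(candidates)  # sum of k0(x_j, c) per candidate
    diag = stein_kernel(candidates, candidates, score)
    for _ in range(n):
        best = candidates[np.argmin(0.5 * diag + running)]
        points.append(best)
        running += stein_kernel(best, candidates, score)
    return np.array(points)

def ksd(points, score):
    """Kernel Stein discrepancy of the empirical measure on `points`."""
    X, Y = np.meshgrid(points, points)
    return np.sqrt(np.mean(stein_kernel(X, Y, score)))

score = lambda x: -x                 # score of a standard Gaussian target
cands = np.linspace(-3.0, 3.0, 121)  # coarse candidate grid (assumption)
pts = greedy_stein_points(20, cands, score)
```

The key property exploited here is that the Stein kernel has zero mean under the target, so the discrepancy can be evaluated from the points alone, without access to the normalising constant of the posterior.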