A scalable diffusion posterior sampling in nonparametric Bayesian linear inverse problems

RCLW03 - Accelerating statistical inference and experimental design with machine learning

Score-based diffusion models (SDMs) have emerged as a powerful tool for sampling from the posterior distribution in Bayesian inverse problems. Existing methods, however, often require multiple evaluations of the forward map to generate a single sample, resulting in significant computational costs for large-scale inverse problems. To address this limitation, we propose a scalable diffusion posterior sampling (SDPS) method tailored to linear nonparametric inverse problems, which avoids forward model evaluations during sampling by shifting computational effort to an offline training phase. In this phase, a task-dependent score function is learned based on the linear forward operator. Crucially, the conditional posterior score is derived exactly from the trained score using affine transformations, eliminating the need for conditional score approximations. Our approach is shown to work in infinite-dimensional diffusion models and is supported by rigorous convergence analysis. We validate SDPS through high-dimensional computed tomography (CT) and image deblurring experiments. Based on joint work with Fabian Schneider, Matti Lassas, Maarten V. de Hoop, and Tapio Helin.
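To illustrate the idea that, for a linear forward operator, the posterior score can be an exact affine expression rather than an approximation, here is a minimal finite-dimensional sketch. It is not the SDPS method from the talk: it uses a Gaussian prior (where the posterior score is available in closed form) and plain Langevin dynamics, and all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem y = A x + noise (dimensions and values are
# illustrative, not from the talk): prior x ~ N(0, I), noise ~ N(0, sigma^2 I).
d, m = 4, 3
A = rng.normal(size=(m, d))
sigma = 1.0
x_true = rng.normal(size=d)
y = A @ x_true + sigma * rng.normal(size=m)

# For a Gaussian prior the posterior is Gaussian with precision
# P = I + A^T A / sigma^2 and mean mu = P^{-1} A^T y / sigma^2, so the
# posterior score is the affine map x -> -P (x - mu): no forward-model
# evaluations are needed once P and mu are precomputed offline.
P = np.eye(d) + A.T @ A / sigma**2
mu = np.linalg.solve(P, A.T @ y / sigma**2)

def posterior_score(x):
    """Exact posterior score for the linear Gaussian toy model."""
    return -P @ (x - mu)

# Unadjusted Langevin dynamics driven by the exact posterior score.
step = 5e-3
x = np.zeros(d)
samples = []
for t in range(100_000):
    x = x + step * posterior_score(x) + np.sqrt(2 * step) * rng.normal(size=d)
    if t > 20_000:
        samples.append(x)
samples = np.array(samples)

# The empirical mean of the chain approaches the posterior mean mu.
print(np.abs(samples.mean(axis=0) - mu).max())
```

The talk's setting replaces the closed-form Gaussian score with a task-dependent score learned offline for a nonparametric (infinite-dimensional) prior; the sketch only shows why linearity of the forward operator makes an exact affine correction possible in principle.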

This talk is part of the Isaac Newton Institute Seminar Series.
