A scalable diffusion posterior sampling in nonparametric Bayesian linear inverse problems
- Speaker: Duc-Lam Duong (LUT University)
- Date & Time: Thursday 26 June 2025, 12:15-12:45
- Venue: Seminar Room 1, Newton Institute
Abstract
Score-based diffusion models (SDMs) have emerged as a powerful tool for sampling from the posterior distribution in Bayesian inverse problems. Existing methods, however, often require multiple evaluations of the forward map to generate a single sample, resulting in significant computational costs for large-scale inverse problems. To address this limitation, we propose a scalable diffusion posterior sampling (SDPS) method tailored to linear nonparametric inverse problems, which avoids forward-model evaluations during sampling by shifting computational effort to an offline training phase. In this phase, a task-dependent score function is learned based on the linear forward operator. Crucially, the conditional posterior score is derived exactly from the trained score using affine transformations, eliminating the need for conditional score approximations. Our approach is shown to work in infinite-dimensional diffusion models and is supported by rigorous convergence analysis. We validate SDPS through high-dimensional computed tomography (CT) and image deblurring experiments. Based on joint work with Fabian Schneider, Matti Lassas, Maarten V. de Hoop, and Tapio Helin.
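The abstract does not spell out the SDPS construction, but the key idea (the posterior score being an exact affine transformation of a learned score for a *linear* forward operator) can be illustrated in the simplest possible setting. The sketch below, a toy linear Gaussian model rather than the paper's method, checks numerically that the posterior score equals the prior score plus an affine data-consistency correction, so sampling needs no forward-map evaluations once `A.T @ A` and `A.T @ y` are precomputed; all names (`A`, `sigma`, `posterior_score_affine`) are illustrative.

```python
import numpy as np

# Toy illustration (not the SDPS algorithm itself): linear model y = A x + noise,
# Gaussian prior x ~ N(0, I), noise ~ N(0, sigma^2 I). The posterior score is an
# affine function of x, obtainable from the prior score by an affine correction.
rng = np.random.default_rng(0)
d, m, sigma = 5, 3, 0.1
A = rng.standard_normal((m, d))          # linear forward operator
x_true = rng.standard_normal(d)
y = A @ x_true + sigma * rng.standard_normal(m)

def prior_score(x):
    return -x                            # score of N(0, I)

# Posterior is N(mu, P^{-1}) with precision P = I + A^T A / sigma^2.
P = np.eye(d) + A.T @ A / sigma**2
mu = np.linalg.solve(P, A.T @ y / sigma**2)

def posterior_score_direct(x):
    return -P @ (x - mu)                 # exact Gaussian posterior score

# Precomputed once, offline: no forward-map call per sample at test time.
AtA, Aty = A.T @ A, A.T @ y

def posterior_score_affine(x):
    return prior_score(x) + (Aty - AtA @ x) / sigma**2

x = rng.standard_normal(d)
assert np.allclose(posterior_score_direct(x), posterior_score_affine(x))
```

In the linear Gaussian case the two expressions agree exactly, which is the finite-dimensional analogue of deriving the conditional posterior score from a trained score by an affine transformation.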
Series: This talk is part of the Isaac Newton Institute Seminar Series.
Included in Lists
- All CMS events
- bld31
- dh539
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences
- Seminar Room 1, Newton Institute
Note: Ex-directory lists are not shown.