Posterior contraction rates for potentially nonlinear inverse problems
If you have a question about this talk, please contact Dr Sergio Bacallado.

We will consider a family of potentially nonlinear inverse problems subject to Gaussian additive white noise. We will assume truncated Gaussian priors, and our interest will be in studying the asymptotic performance of the Bayesian posterior in the small-noise limit. In particular, we will develop a theory for obtaining posterior contraction rates. The theory is based on the techniques of Knapik and Salomond (2018), which show how to derive posterior contraction rates for inverse problems from rates of contraction for the corresponding direct problems, via the notion of the modulus of continuity. We will work under the assumption that the forward operator can be associated with a linear operator in a certain sense. We will present techniques from regularization theory that allow us both to bound the modulus of continuity and to derive optimal rates of contraction for the direct problem by appropriately tuning the prior truncation level. Finally, we will combine these results to obtain optimal rates of contraction for a range of inverse problems.

This is joint work with Peter Mathé (Weierstrass Institute, Berlin).

This talk is part of the Statistics series.
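As a rough guide to the setting, here is a minimal sketch in LaTeX of the kind of model and contraction statement the abstract refers to. The notation (G for the forward operator, delta for the noise level, u_0 for the truth) is an illustrative assumption on our part, not taken from the talk itself:

% Observation model with Gaussian additive white noise (illustrative notation):
Y^{\delta} = G(u) + \delta\,\xi, \qquad \xi \ \text{Gaussian white noise}, \qquad \delta \to 0.

% Posterior contraction at rate \varepsilon_{\delta} around the truth u_0:
\Pi\bigl( u : \|u - u_0\| \geq M \varepsilon_{\delta} \,\big|\, Y^{\delta} \bigr) \to 0
\quad \text{in probability as } \delta \to 0.

% Knapik--Salomond (2018) style reduction: if the posterior contracts at rate
% \rho_{\delta} for the direct problem, and \omega is a modulus of continuity
% of the inverse, i.e.
%   \|u - u_0\| \leq \omega\bigl( \|G(u) - G(u_0)\| \bigr)
% on a suitable set, then the inverse problem contracts at rate
% \varepsilon_{\delta} = \omega(\rho_{\delta}).

As the abstract indicates, the truncation level of the Gaussian prior plays the role of the tuning parameter here: choosing it appropriately optimizes the direct rate \rho_{\delta}, which the modulus of continuity then converts into the rate for the inverse problem.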