Variational Bayes for high-dimensional linear regression with sparse priors
If you have a question about this talk, please contact Randolf Altmeyer.
A core problem in Bayesian statistics is approximating posterior distributions that are difficult to compute. In variational Bayes (VB), a method from machine learning, the posterior is approximated through optimization, which is typically faster than Markov chain Monte Carlo. We study a mean-field (i.e. factorizable) VB approximation to Bayesian model selection priors, including the popular spike-and-slab prior, in sparse high-dimensional linear regression. We establish convergence rates for this VB approach and study conditions under which it provides good estimation. We also discuss computational issues and examine the empirical performance of the algorithm.
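For readers unfamiliar with the setup, the following is a minimal sketch of the mean-field VB formulation in the sparse linear model; the Gaussian noise model and the generic slab density G are illustrative choices and need not match the exact setting of the talk.

\[
Y = X\theta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I_n), \qquad \theta \in \mathbb{R}^p, \quad p \gg n,
\]
\[
\theta_i \overset{iid}{\sim} (1 - w)\,\delta_0 + w\,G, \qquad i = 1, \dots, p \quad \text{(spike-and-slab prior)},
\]
\[
\widehat{Q} \;=\; \operatorname*{arg\,min}_{Q = \prod_{i=1}^p q_i} \mathrm{KL}\big(Q \,\|\, \Pi(\cdot \mid Y)\big)
\;=\; \operatorname*{arg\,max}_{Q = \prod_{i=1}^p q_i} \Big\{ \mathbb{E}_Q\big[\log p(Y \mid \theta)\big] - \mathrm{KL}(Q \,\|\, \Pi) \Big\}.
\]

Here \(\Pi\) denotes the prior and \(\Pi(\cdot \mid Y)\) the posterior; minimizing the KL divergence to the posterior over the factorized (mean-field) family is equivalent to maximizing the evidence lower bound shown on the right.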
This talk is part of the CCIMI Seminars series.