
On the use of non-local priors for joint high-dimensional estimation and selection


A central challenge in modern statistics is to devise strategies that are effective in high dimensions. Ideally one would like solutions that are on the one hand parsimonious, i.e. help interpret the data-generating process in a simple manner, but that on the other hand yield accurate predictions. As is well documented in the literature, there is a tension between these two competing goals: simpler explanatory models tend to incur higher prediction errors, while accurate predictive models tend to be more complex than one would ideally wish.

We explore the extent to which these two goals can be reconciled, adopting a Bayesian framework and a novel formulation based on non-local priors (NLPs). This class of priors has been proven to lead to faster learning rates for model selection, rates that are indispensable if one is to attain Bayesian consistency in high dimensions. Because they induce extra parsimony in the solution, NLPs typically yield adequately simple explanatory models. Interestingly, we recently discovered that NLPs also deliver improved shrinkage rates for parameter estimation, leading to highly accurate predictions in high dimensions and hence to models that are promising for explanatory and predictive purposes alike. We will illustrate these issues by reviewing some of the relevant theory, showing various practical examples, and proposing strategies to deal efficiently with some of the main computational issues at stake.
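To make the notion of a non-local prior concrete, the sketch below evaluates a first-order moment (MOM) prior, a standard NLP construction in which a normal kernel is multiplied by theta^2 so that the density vanishes at the null value theta = 0 (it is this vanishing that drives the faster model-selection learning rates mentioned above). The choice of normal kernel and the dispersion parameter tau are illustrative assumptions, not taken from the talk.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def mom_prior(theta, tau=1.0):
    """First-order MOM non-local prior density.

    A N(0, tau) kernel multiplied by theta^2 / tau; the factor 1/tau
    normalizes the density, since E[theta^2] = tau under N(0, tau).
    The density equals zero exactly at the null value theta = 0,
    unlike a local prior (e.g. plain normal), which is positive there.
    """
    return (theta ** 2 / tau) * norm.pdf(theta, loc=0.0, scale=np.sqrt(tau))

# Vanishes at the null value, which penalizes spurious small coefficients.
print(mom_prior(0.0))   # 0.0

# Still a proper density: integrates to 1 over the real line.
total, _ = quad(mom_prior, -np.inf, np.inf)
print(round(total, 6))  # 1.0
```

Intuitively, assigning zero prior density to the null makes the Bayes factor against a spurious variable accumulate evidence faster, which is the parsimony mechanism the abstract alludes to.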

This talk is part of the Statistics series.

