Improved Nonparametric Empirical Bayes Estimation using Transfer Learning

If you have a question about this talk, please contact Dr Sergio Bacallado.

We consider the problem of estimating a multivariate normal mean in the presence of possibly useful auxiliary variables. The traditional nonparametric empirical Bayes (NEB) framework provides an elegant interface for pooling information across dimensions and facilitates the construction of effective shrinkage estimators. Such estimators can be further improved by incorporating pertinent information from the auxiliary variables; however, detecting and assimilating potentially useful information from auxiliary variables into shrinkage estimators is difficult. Here, we develop a new methodology that transfers useful information from multiple auxiliary variables to yield improved Tweedie-type NEB estimators. Our method uses convex optimization to directly estimate the gradient of the log-density through an embedding in the reproducing kernel Hilbert space induced by Stein's discrepancy metric. We establish the asymptotic optimality of the resulting estimator, and we precisely quantify both the improvement in estimation error and the deterioration in the learning rate as the number of auxiliary variables inspected increases. We demonstrate the competitive performance of our method relative to existing NEB approaches through simulation experiments and real-data settings. This is joint work with Jiajun Luo and Wenguang Sun.
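For context (not part of the abstract): Tweedie-type estimators rest on the classical Tweedie formula for the normal-means model, which expresses the posterior mean through the score of the marginal density. Assuming X_i | theta_i ~ N(theta_i, sigma^2) with marginal density f, the formula reads

\hat{\theta}_i = \mathbb{E}[\theta_i \mid X_i = x_i] = x_i + \sigma^2 \, \frac{d}{dx_i} \log f(x_i).

NEB estimators plug an estimate of the score d/dx log f into this formula; how the talk's method estimates the score via convex optimization in an RKHS while incorporating auxiliary variables is the subject of the talk and is not reproduced here.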

This talk is part of the Statistics series.
