
## Learning from MOM's principles

- Guillaume Lecué (ENSAE)
- Friday 17 November 2017, 16:00-17:00
- MR12.
If you have a question about this talk, please contact Quentin Berthet.

(Joint work with Matthieu Lerasle.)

We obtain theoretical and practical performance guarantees for median-of-means (MOM) estimators. From a theoretical point of view, the estimation and prediction error bounds achieved by MOM estimators hold with exponentially large probability, as in the Gaussian framework with independent noise, under only weak moment assumptions on the data and without assuming independence between the noise and the design. Moreover, MOM procedures are robust, since a large part of the data may have nothing to do with the oracle we want to reconstruct. Our general risk bound is of order max(minimax rate of convergence in the i.i.d. setup, (number of outliers)/(number of observations)). In particular, the number of outliers may be as large as (number of data points) × (minimax rate) without affecting the statistical properties of the MOM estimator.

A regularization norm may also be used together with the MOM criterion, and in that case any norm can be used for regularization. When the norm has some sparsity-inducing power, we recover sparse rates of convergence and sparse oracle inequalities. For example, the minimax rate s log(d/s)/N for recovery of an s-sparse vector in R^d is achieved by a median-of-means version of the LASSO when the noise has q_0 moments for some q_0 > 2, the design matrix has C_0 log(d) moments, and the dataset is corrupted by s log(d/s) outliers. This result holds with exponentially large probability, as if the noise and the design were i.i.d. Gaussian random variables.

On the practical side, MOM estimators (and their associated regularized versions) can easily be implemented. In fact, most gradient descent algorithms used to implement (non-robust) estimators like the LASSO can easily be turned into robust ones by using a MOM approach.

This talk is part of the Statistics series.
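To illustrate the basic median-of-means idea underlying the talk, here is a minimal sketch in Python (the function name and block-splitting scheme are illustrative choices, not from the talk): split the sample into disjoint blocks, average each block, and return the median of the block means, so that a small number of corrupted observations can spoil only a few blocks.

```python
import numpy as np

def median_of_means(x, n_blocks, rng=None):
    """Median-of-means estimate of the mean of a 1-D sample.

    The sample is split into n_blocks disjoint blocks; each block is
    averaged, and the median of the block means is returned.  Outliers
    can contaminate at most as many block means as there are outliers,
    leaving the median of the remaining means unaffected.
    """
    x = np.asarray(x, dtype=float)
    if rng is not None:
        # Optional shuffle, so adversarially ordered data cannot
        # concentrate in a single block pattern.
        x = rng.permutation(x)
    blocks = np.array_split(x, n_blocks)
    block_means = np.array([b.mean() for b in blocks])
    return np.median(block_means)

# 95 clean observations and 5 gross outliers: the empirical mean is
# dragged to 5.0, while the MOM estimate stays at the clean value 0.0.
x = np.concatenate([np.zeros(95), np.full(5, 100.0)])
print(x.mean())                    # 5.0
print(median_of_means(x, 10))      # 0.0
```

The same blocking trick is what allows gradient-descent implementations of estimators like the LASSO to be robustified: at each step the gradient is computed on the median block rather than on the full (possibly corrupted) sample.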