Statistical Risk Characterization of Penalized Likelihood Procedures: An Information-Theoretic Determination
We review theory for the Minimum Description Length (MDL) principle, penalized likelihood, and its statistical risk. An information-theoretic condition on a penalty pen(f) yields the conclusion that the optimizer of the penalized log-likelihood criterion log 1/likelihood(f) + pen(f) has risk not more than the index of resolvability, which corresponds to the accuracy of the optimizer of the expected value of the criterion. For the linear span of a dictionary of candidate terms, we develop the information-theoretic validity of penalties based on the l_1 norm of the coefficients in regression and log-density estimation settings. New results are presented for Gaussian graphical models. This represents joint work with Xi Luo and Sabyasachi Chatterjee.
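As a rough, schematic illustration of the kind of bound referred to above (the notation below is ours, not taken from the abstract): suppose X_1, ..., X_n are drawn from an unknown density p^*, F is a countable collection of candidate densities p_f, and the penalty satisfies a Kraft-type summability condition \sum_{f \in F} e^{-\mathrm{pen}(f)} \le 1. The penalized likelihood estimator and the associated resolvability bound then take approximately the form

\hat{f} \;=\; \arg\min_{f \in F} \Big\{ \log \frac{1}{p_f(X_1,\dots,X_n)} + \mathrm{pen}(f) \Big\},
\qquad
\mathbb{E}\, d\big(p^*, p_{\hat f}\big) \;\le\; \min_{f \in F} \Big\{ \tfrac{1}{n} D\big(p^{*\,n} \,\|\, p_f^{\,n}\big) + \tfrac{1}{n}\,\mathrm{pen}(f) \Big\},

where d is a Hellinger-type (Bhattacharyya) divergence, D is the Kullback-Leibler divergence, and the right-hand side is the index of resolvability. This is a sketch of the general shape of such results, not the precise statement given in the talk.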
This talk is part of the Statistics series.