BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Statistics
SUMMARY:Sparsity pattern aggregation for convex stochastic
optimization. - Philippe Rigollet (Princeton Univ
ersity)
DTSTART;TZID=Europe/London:20101029T160000
DTEND;TZID=Europe/London:20101029T170000
UID:TALK25760AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/25760
DESCRIPTION:Important statistical problems including regressio
n\, binary classification\nand density estimation
can be recast as convex stochastic optimization\np
roblems when seen from the point of view of statis
tical aggregation. These\nconvex problems can be n
umerically solved efficiently in high dimension bu
t\nmay show mediocre statistical performance. One
way to overcome this\nsituation consists in assumi
ng that there exist approximate solutions\,\ncalle
d "sparse"\, that are of moderate dimension. This pre
sentation\nintroduces a new method called "expo
nential screening (ES)" as an\nalternative to the
$\\ell_1$-penalization idea\, which is currently t
he most\npopular way to find these sparse solution
s. While $\\ell_1$-based methods can\nbe analyze
d only under rather stringent assumptions\, ES sho
ws optimal\nstatistical performance under fairly gen
eral assumptions. Implementation is\nnot straightf
orward but it can be approximated using the Metrop
olis\nalgorithm\, which yields a stochastic greed
y algorithm that performs\nsurprisingly well in a sim
ulated problem of sparse recovery.\n\n\nhttp://
www.princeton.edu/~rigollet/index.html
LOCATION:MR12\, CMS\, Wilberforce Road\, Cambridge\, CB3 0W
B
CONTACT:Richard Nickl
END:VEVENT
END:VCALENDAR