BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:CCIMI Seminars
SUMMARY:Stochastic variants of classical optimization meth
ods\, with complexity guarantees - Professor Coral
ia Cartis
DTSTART;TZID=Europe/London:20190501T140000
DTEND;TZID=Europe/London:20190501T150000
UID:TALK114850AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/114850
DESCRIPTION:Optimization is a key component of machine learnin
 g applications\, as it helps with training of (neur
al net\, nonconvex) models and parameter tuning. C
lassical optimization methods are challenged by th
e scale of machine learning applications and the l
 ack of\, or cost of\, full derivatives\, as well as the sto
 chastic nature of the problem. On the other han
d\, the simple approaches that the machine learnin
g community uses need improvement. Here we try to
merge the two perspectives and adapt the strength
of classical optimization techniques to meet the c
hallenges of data science applications: from deter
ministic to stochastic problems\, from typical to
large scale. We propose a general algorithmic fram
ework and complexity analysis that allows the use
 of inexact\, stochastic\, and even possibly biased pro
 blem information in classical methods for nonc
onvex optimization. This work is joint with Katya
Scheinberg (Cornell)\, Jose Blanchet (Columbia) an
d Matt Menickelly (Argonne).
LOCATION:CMS\, MR14
CONTACT:J.W.Stevens
END:VEVENT
END:VCALENDAR