BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Compositional Features and Feedforward Neural Netw
orks for High Dimensional Problems - Wei Kang (Na
val Postgraduate School)
DTSTART;TZID=Europe/London:20211116T163000
DTEND;TZID=Europe/London:20211116T170000
UID:TALK165418AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/165418
DESCRIPTION:Deep learning has had many impressive empirical suc
 cesses in science and industry. On the other hand\, the lack o
 f theoretical understanding of the field has been a large barr
 ier to the adoption of the technology. In this talk\, I will d
 iscuss some compositional features of high-dimensional problem
 s and their mathematical properties that shed light on the que
 stion of why deep learning works for high-dimensional problems
 . It is widely observed in science and engineering that compli
 cated\, high-dimensional input-output relations can be represe
 nted as compositions of functions with low input dimensions. T
 hese compositional structures can be effectively represented u
 sing layered directed acyclic graphs (layered DAGs). Based o
 n the layered DAG formulation\, an algebraic framework and app
 roximation theory are developed for compositional functio
 ns\, including neural networks. The theory leads to proofs o
 f several complexity/approximation error bounds of deep neura
 l networks for problems of regression and dynamical systems.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR