BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Machine Learning @ CUED
SUMMARY:Deep Neural Networks: A Nonparametric Bayesian Approach with Local Competition - Konstantinos P. Panousis
DTSTART;TZID=Europe/London:20190620T110000
DTEND;TZID=Europe/London:20190620T120000
UID:TALK125806AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/125806
DESCRIPTION:The aim of this work is to enable inference of deep networks that retain high accuracy for the least possible model complexity\, with the latter deduced from the data during inference. To this end\, we revisit deep networks that comprise competing linear units\, as opposed to nonlinear units that do not entail any form of (local) competition. In this context\, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections or locally competing sets of units\, as well as the required floating point precision for storing the network parameters. Specifically\, we introduce auxiliary discrete latent variables representing which initial network components are actually needed for modeling the data at hand\, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show using benchmark\ndatasets\, our approach yields networks with less computational footprint than the state-of-the-art\, and with no compromises in predictive accuracy.
LOCATION:Engineering Department\, CBL Room BE-438.
CONTACT:Robert Peharz
END:VEVENT
END:VCALENDAR