BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Evaluating Deep Generative Models on Out-of-Distribution Inputs - 
 Eric Nalisnick (University of Cambridge)
DTSTART:20200306T120000Z
DTEND:20200306T130000Z
UID:TALK133057@talks.cam.ac.uk
CONTACT:James Thorne
DESCRIPTION:Generative models are widely believed to be more robust to out
 -of-training-distribution inputs than conditional (i.e. predictive) model
 s. In this talk\, we challenge this assumption. We find that the densitie
 s learned by flow-based models\, VAEs\, and PixelCNNs cannot distinguish
  images of common objects such as dogs\, trucks\, and horses from those
  of house numbers\, assigning a higher likelihood to the latter when the
  model is trained on the former. We posit that this phenomenon is caused
  by a mismatch between the model’s typical set and its areas of high pro
 bability density: in-distribution inputs should reside in the former but
  not necessarily in the latter. To determine whether inputs reside in th
 e typical set\, we propose a computationally efficient hypothesis test u
 sing the empirical distribution of model likelihoods. Experiments show t
 hat this test succeeds in detecting out-of-distribution inputs in many c
 ases where previously proposed threshold-based techniques fail.
LOCATION:FW26\, Computer Laboratory
END:VEVENT
END:VCALENDAR
