BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:ML@CL Seminar Series
SUMMARY:Inter-domain Deep Gaussian Processes - Tim G. J. R
 udner\, University of Oxford
DTSTART;TZID=Europe/London:20201016T130000
DTEND;TZID=Europe/London:20201016T150000
UID:TALK151939AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/151939
DESCRIPTION:*1-on-1 Meetings with the Speaker*:\n\nThe talk is
  scheduled for one hour and will be followed by on
 e hour of 1-on-1 meetings with the speaker. To boo
 k a 1-on-1 slot with the speaker\, email the organ
 izer before the event.\n\n\n*Paper:*\n\n"Inter-dom
 ain Deep Gaussian Processes":http://timrudner.com/
 papers/Inter-domain_Deep_Gaussian_Processes/Rudner
 2020_Inter-domain_Deep_Gaussian_Processes.pdf (ICM
 L 2020)\n\n*Abstract:*\n\nInter-domain Gaussian pr
 ocesses (GPs) allow for high flexibility and low c
 omputational cost when performing approximate infe
 rence in GP models. They are particularly suitable
  for modeling data exhibiting global structure but
  are limited to stationary covariance functions an
 d thus fail to model non-stationary data effective
 ly. We propose Inter-domain Deep Gaussian Processe
 s\, an extension of inter-domain shallow GPs that 
 combines the advantages of inter-domain and deep G
 aussian processes (DGPs)\, and demonstrate how to 
 leverage existing approximate inference methods to
  perform simple and scalable approximate inference
  using inter-domain features in DGPs. We assess th
 e performance of our method on a range of regressi
 on tasks and demonstrate that it outperforms inter
 -domain shallow GPs and conventional DGPs on chall
 enging large-scale real-world datasets exhibiting 
  both global structure and a high degree of non-s
 tationarity.\n\n*Keywords:* Gaussian Processe
 s\, Variational Inference\, Bayesian Deep Learning
 \n\n\n*About the Speaker:*\n\nTim G. J. Rudner is 
 a PhD Candidate in the Department of Computer Scie
 nce at the University of Oxford\, supervised by Ya
 rin Gal and Yee Whye Teh. His research interests s
 pan Bayesian deep learning\, reinforcement learnin
 g\, and variational inference. He holds a master’s
  degree in statistics from the University of Oxfor
 d and an undergraduate degree in mathematics and e
 conomics from Yale University. Tim is also an AI F
 ellow at Georgetown University's Center for Securi
 ty and Emerging Technology (CSET)\, a Fellow of th
 e German National Academic Foundation\, and a Rhod
 es Scholar.\n\n*Website:* "http://timrudner.com":h
 ttp://timrudner.com\n\n\n\nThis talk is part of th
 e ML@CL Seminar Series with a focus on early caree
 r researchers and topics relevant to machine learn
 ing and statistics.
LOCATION:https://dtudk.zoom.us/j/63696188914?pwd=L3RNZFlVWj
 dvdCtwMTAzaXA0UHVlQT09
CONTACT:Francisco Vargas
END:VEVENT
END:VCALENDAR
