BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Relative Entropy Coding for Learned Data Compression - Greg Flamic
 h\, Engineering Department (Cambridge)
DTSTART:20230523T120000Z
DTEND:20230523T130000Z
UID:TALK198307@talks.cam.ac.uk
CONTACT:Mateja Jamnik
DESCRIPTION:In recent years\, machine learning (ML) has ignited a revolu
 tion in data compression\, as researchers and engineers can now design
  codecs that learn how to encode information optimally from large data
 sets. These ML-based methods use deep generative models (DGMs)\, such as
  variational autoencoders or diffusion models\, to build a distributio
 n over the data. DGMs generate data by simulating a sample from a simp
 le latent distribution\, such as a Gaussian\, which they transform int
 o a sample from the data distribution using a deep neural network. Hen
 ce\, we can encode data by encoding the latent sample that generated i
 t and using the DGM to reconstruct it. Surprisingly\, however\, tradit
 ional methods for encoding the latent sample are suboptimal\, and a fa
 r more efficient approach exists\, called relative entropy coding (REC
 ).\n\nIn this talk\, I will first give an overview of learned data com
 pression and some issues it faces\, and use these to motivate REC. The
 n\, I will present a simple REC algorithm\, revealing a surprising equ
 ivalence between sampling and search. Finally\, I will discuss the mai
 n limitations of current REC algorithms\, which have so far prevented
  their practical application\, and lay out some potential ways to reso
 lve them.\n\nYou can also join us on Zoom: https://zoom.us/j/99166955895
 ?pwd=SzI0M3pMVEkvNmw3Q0dqNDVRalZvdz09
LOCATION:Lecture Theatre 2
END:VEVENT
END:VCALENDAR
