
Data Compression with Relative Entropy Coding


  • Gergely Flamich, University of Cambridge
  • Wednesday 05 March 2025, 14:00-15:00
  • MR5, CMS Pavilion A

If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

In lossy source coding, one aims to encode data using as few bits as possible while ensuring the data can be recovered from the code with as little distortion as possible. Typically, lossy source coding algorithms comprise two steps: 1) they discard information by quantising the source, and 2) they encode the quantised representations losslessly with entropy coding algorithms such as arithmetic or Huffman coding. Unfortunately, this approach makes it challenging to combine lossy source coding with modern machine learning methods, since quantisation is a non-differentiable operation. However, quantisation is not the only mechanism we could use to discard information. Instead, we could perturb the data (e.g. with Gaussian noise) and encode the perturbed representation; this is the essence of relative entropy coding.
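
As a rough illustration of this contrast (a minimal sketch in Python, not a scheme from the talk; the latent vector and noise scale below are arbitrary placeholders):

    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.normal(size=5)  # stand-in for a learned representation of some data

    # Route 1: quantise. Rounding discards information, but its gradient is
    # zero almost everywhere, which is what makes end-to-end training awkward.
    z_quantised = np.round(z)

    # Route 2: perturb with Gaussian noise instead. The map z -> z + noise is
    # differentiable in z, so it can sit inside a model trained by gradient
    # descent; relative entropy coding then encodes a sample of this perturbed
    # representation directly.
    z_perturbed = z + 0.5 * rng.normal(size=z.shape)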

This talk provides an introduction to relative entropy coding. I begin by discussing the formal definition of the problem: what should “encoding a perturbed version of the data” even mean? Then, I provide an overview of its applications, such as learned data compression, low-rate compression with realism constraints, and differential privacy. Next, I describe a relative entropy coding algorithm I developed: greedy Poisson rejection sampling (GPRS). This involves a somewhat unorthodox use of Poisson processes: they form the basis of the algorithm’s construction! Finally, I discuss recent results on the fundamental limits of the communication and computational complexity of relative entropy coding algorithms and highlight some interesting open problems and future research directions.
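
For orientation, one common way the problem is formalised in the channel simulation literature (a sketch of the standard setup, not necessarily the exact formulation used in the talk): the encoder and decoder share a prior distribution $P$ and a source of common randomness, and the encoder must communicate a single sample $Z \sim Q$, where $Q$ is the distribution of the perturbed representation. Known schemes achieve an expected codelength of roughly
\[
  \mathbb{E}[\ell] \;\approx\; D_{\mathrm{KL}}(Q \,\|\, P) + \log_2\!\bigl(D_{\mathrm{KL}}(Q \,\|\, P) + 1\bigr) + O(1) \quad \text{bits},
\]
so the relative entropy between the target $Q$ and the prior $P$ plays the role that entropy plays in lossless source coding.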

This talk is part of the Information Theory Seminar series.
