Data Compression with Relative Entropy Coding
If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

This talk is part of the Information Theory Seminar series.

In lossy source coding, the aim is to encode data using as few bits as possible while ensuring that the data can be approximately recovered from the code with as little distortion as possible. Typically, lossy source coding algorithms comprise two steps: 1) they discard information by quantising the source, and 2) they encode the quantised representations losslessly with an entropy coding algorithm such as arithmetic or Huffman coding. Unfortunately, this approach makes it challenging to combine lossy source coding with modern machine learning methods, since quantisation is a non-differentiable operation. However, quantisation is not the only mechanism for discarding information: instead, we can perturb the data (e.g. with Gaussian noise) and encode the perturbed representation. This is the essence of relative entropy coding.

This talk provides an introduction to relative entropy coding. I begin by discussing the formal definition of the problem: what should “encoding a perturbed version of the data” even mean? Then, I give an overview of its applications, such as learned data compression, low-rate compression with realism constraints, and differential privacy. Next, I describe a relative entropy coding algorithm I developed: greedy Poisson rejection sampling (GPRS). This involves a somewhat unorthodox use of Poisson processes: they form the basis of the algorithm’s construction. Finally, I discuss recent results on the fundamental limits of the communication and computational complexity of relative entropy coding algorithms, and highlight some interesting open problems and future research directions.
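To make the classical two-step pipeline concrete, here is a minimal sketch (illustrative only, not from the talk; numpy is assumed) that quantises Gaussian samples with a uniform scalar quantiser and then losslessly entropy-codes the quantised symbols with a small Huffman coder:

```python
import heapq
from collections import Counter

import numpy as np

def huffman_code(freqs):
    """Build a Huffman code from a {symbol: count} map."""
    # Heap entries: (total count, unique tiebreak, {symbol: codeword so far}).
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        n0, _, c0 = heapq.heappop(heap)
        n1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (n0 + n1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

rng = np.random.default_rng(0)
source = rng.standard_normal(10_000)

# Step 1: discard information by quantising to a uniform grid.
step = 0.5  # grid width trades rate against distortion
symbols = np.round(source / step).astype(int).tolist()

# Step 2: encode the quantised symbols losslessly with entropy coding.
code = huffman_code(Counter(symbols))
rate = sum(len(code[s]) for s in symbols) / len(symbols)
distortion = np.mean((source - np.array(symbols) * step) ** 2)
print(f"rate = {rate:.2f} bits/sample, distortion (MSE) = {distortion:.4f}")
```

Step 1 is where differentiability breaks down: the gradient of `np.round` is zero almost everywhere, which is what makes end-to-end training of such pipelines awkward.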
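As for relative entropy coding itself, perhaps the simplest scheme to sketch is rejection sampling with shared randomness; note this is an illustrative stand-in, not the speaker’s GPRS algorithm, and it assumes numpy/scipy and Gaussian target and proposal distributions. Encoder and decoder share a random seed, hence an identical stream of proposal samples from P; the encoder runs a rejection test for the target Q and transmits only the index of the first accepted sample, which the decoder uses to replay the stream:

```python
import numpy as np
from scipy.stats import norm

def rec_encode(mu_q, sigma_q, seed, max_steps=1_000_000):
    """Encode a fresh sample from Q = N(mu_q, sigma_q^2) against P = N(0, 1)."""
    rng = np.random.default_rng(seed)
    # Rejection sampling needs a bound M >= max_x q(x) / p(x); here we take a
    # crude numerical bound over a grid (fine for this toy; requires sigma_q < 1).
    xs = np.linspace(-10.0, 10.0, 10_001)
    M = np.max(norm.pdf(xs, mu_q, sigma_q) / norm.pdf(xs))
    for k in range(1, max_steps + 1):
        x = rng.standard_normal()   # proposal sample, shared with the decoder
        u = rng.uniform()
        if u * M * norm.pdf(x) <= norm.pdf(x, mu_q, sigma_q):
            return k                # the message: index of the accepted sample
    raise RuntimeError("no acceptance within max_steps")

def rec_decode(k, seed):
    """Replay the shared stream and return the k-th proposal sample ~ Q."""
    rng = np.random.default_rng(seed)
    for _ in range(k):
        x = rng.standard_normal()
        rng.uniform()               # keep the two streams aligned
    return x

k = rec_encode(mu_q=1.0, sigma_q=0.5, seed=42)
print(k, rec_decode(k, seed=42))   # decoder output is an exact sample from Q
```

No quantisation appears anywhere, and the decoder’s output is an exact sample from Q, i.e. a noise-perturbed representation. The transmitted index can itself be entropy-coded, but for plain rejection sampling its expected value is M, so the codelength scales with the worst-case density ratio rather than with D_KL(Q||P); closing that gap, down to roughly D_KL(Q||P) plus logarithmic overhead, is what algorithms such as GPRS are designed for.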