
Lossy Compression Coding Theorems for Arbitrary Sources


  • Yiannis Kontoyiannis (University of Cambridge; Athens University of Economics and Business)
  • Monday 23 July 2018, 11:00-11:45
  • Seminar Room 1, Newton Institute.

If you have a question about this talk, please contact INI IT.

MQIW05 - Beyond I.I.D. in information theory

We give a development of the theory of lossy data compression from the point of view of statistics. This is partly motivated by the enormous success of the statistical approach in lossless compression. A precise characterization of the fundamental limits of compression performance is given, for arbitrary data sources and with respect to general distortion measures. The emphasis is on non-asymptotic results and results that hold with high probability (and not just on the average). The starting point for this development is the observation that there is a precise correspondence between compression algorithms and probability distributions (in analogy with the Kraft inequality in lossless compression). This leads us to formulate a version of the celebrated Minimum Description Length (MDL) principle for lossy data compression. We discuss the consequences of the lossy MDL principle, and we explain how it can lead to practical lessons for the design of vector quantizers.
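As background to the correspondence mentioned above, the standard lossless fact presumably being invoked is the Kraft inequality: the codeword lengths \(\ell(x)\) of any binary prefix code satisfy the constraint below, so every lossless code induces a (sub-)probability distribution \(Q(x) = 2^{-\ell(x)}\), and conversely any distribution \(Q\) admits a prefix code with lengths \(\lceil -\log_2 Q(x) \rceil\). A minimal statement of this correspondence:

\[
\sum_{x \in \mathcal{X}} 2^{-\ell(x)} \;\le\; 1,
\qquad
\ell_Q(x) \;=\; \big\lceil -\log_2 Q(x) \big\rceil .
\]

In this sense, choosing a lossless code is, to within one bit per symbol, equivalent to choosing a probability distribution; the abstract describes an analogous correspondence for lossy compression under general distortion measures.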

This talk is part of the Isaac Newton Institute Seminar Series.

