Understanding the Source Coding Theorem: A Talk on Shannon's Entropy
University of Cambridge · Churchill CompSci Talks
If you have a question about this talk, please contact Matthew Ireland. Online only.

How many bits do we need to encode a sequence of English characters? Can we do better by considering the relative frequency of each character? Is there a theoretical limit to how well we can encode data with negligible risk of information loss?

In this talk, we first define Shannon's entropy, which quantifies the predictability of a sequence of random variables. We then explore the Source Coding Theorem, which gives entropy an operational meaning by establishing a fundamental limit to the compressibility of information. Finally, we prove this theorem using some surprisingly simple results from probability theory.

This talk is part of the Churchill CompSci Talks series.
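To make the idea concrete, here is a minimal sketch of computing Shannon's entropy for a discrete distribution. The distribution below is a hypothetical toy example (not real English letter frequencies): with four symbols, a fixed-length code needs 2 bits per symbol, while the entropy shows that fewer bits per symbol suffice on average.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

    Terms with p == 0 are skipped, following the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical toy distribution over four symbols (illustrative only).
probs = [0.5, 0.25, 0.125, 0.125]

H = shannon_entropy(probs)
print(H)  # 1.75 bits/symbol, versus 2 bits/symbol for a fixed-length code
```

The Source Coding Theorem says this entropy is the best achievable average rate: no lossless code can use fewer than H bits per symbol on average, and codes approaching H exist.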