
Refinement of Two Fundamental Tools in Information Theory


If you have a question about this talk, please contact Nicholas Teh.

In Shannon’s original paper, and in many standard references in information theory, the entropy of a discrete random variable is assumed or shown to be a continuous function. However, we found that all of Shannon’s information measures, including entropy and mutual information, are discontinuous in general when the random variables take values in countably infinite alphabets. This fundamental property explains why strong typicality and Fano’s inequality can only be applied to finite alphabets. Since strong typicality and Fano’s inequality have wide applications in information theory, it is important to extend them to full generality.
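As a minimal sketch of the discontinuity phenomenon (a standard textbook construction, not necessarily the one used in the talk): let P_n put mass 1 - 1/n on symbol 0 and spread the remaining 1/n uniformly over 2^n extra symbols. Then P_n converges to the point mass at 0 in total variation, yet H(P_n) tends to 1 bit while the point mass has entropy 0, so entropy is not continuous on a countably infinite alphabet.

```python
import math

def entropy_Pn(n: int) -> float:
    """Entropy of P_n in bits, computed in closed form.

    P_n(0) = 1 - 1/n; the remaining mass 1/n is split uniformly
    over 2**n symbols, each of probability 1/(n * 2**n).
    """
    p0 = 1.0 - 1.0 / n
    # -p0 log p0 for symbol 0, plus 2**n terms each contributing
    # (1/(n*2**n)) * log2(n * 2**n), which sum to (1/n)(log2 n + n).
    return -p0 * math.log2(p0) + (1.0 / n) * (math.log2(n) + n)

def tv_to_delta0(n: int) -> float:
    """Total-variation distance between P_n and the point mass at 0."""
    return 1.0 / n

for n in (10, 100, 1000, 10000):
    print(f"n={n:6d}  TV={tv_to_delta0(n):.5f}  H(P_n)={entropy_Pn(n):.4f} bits")
```

As n grows, the total-variation distance to the point mass shrinks to 0 while H(P_n) approaches 1 bit, not H(delta_0) = 0 — the jump the abstract refers to.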

In this talk, details about the discontinuity of Shannon’s information measures will be given. We will show how these results lead to a new definition of typicality and an inequality tighter than Fano’s inequality. Applications in network coding and information theoretic security will be discussed.

This talk is part of the DAMTP Information Theory Seminar series.

