
A functional perspective on Information Measures

If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

Information Measures are indisputably the main characters in Information Theory. Shannon completely characterised the problem of compression via Entropy, and the problem of communication over noisy channels via Mutual Information and Capacity. Over the years, numerous novel information measures have been defined, all sharing similar properties; however, some of these quantities have yet to be associated with practical applications. In this presentation, I will provide a perspective on these objects that enables a better understanding of their connection to practical problems. I will then demonstrate the practical value of these ideas by employing Information Measures in several settings: bounding the generalisation error in Learning Theory, establishing impossibility results in Estimation Theory, and addressing concentration phenomena for non-independent random variables.
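As an illustration of the kind of result alluded to in the abstract (a well-known bound from the literature, not necessarily the one presented in the talk), the mutual-information generalisation bound of Xu and Raginsky states that, for a learning algorithm mapping a dataset $S$ of $n$ i.i.d. samples to a hypothesis $W$, with a loss function that is $\sigma$-sub-Gaussian under the data distribution,

\[
\bigl|\,\mathbb{E}\bigl[\mathrm{gen}(W,S)\bigr]\,\bigr| \;\le\; \sqrt{\frac{2\sigma^{2}\, I(W;S)}{n}},
\]

where $\mathrm{gen}(W,S)$ is the gap between the population risk and the empirical risk of $W$ on $S$, and $I(W;S)$ is the mutual information between the learned hypothesis and the training data. Bounds of this form tie an information measure directly to a statistical-learning quantity, in the spirit of the applications listed above.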

This talk is part of the Information Theory Seminar series.
