Communication in the presence of sparsity

If you have a question about this talk, please contact HoD Secretary, DPMMS.

In his seminal 1948 work, Claude Shannon determined the fundamental limits of the best achievable performance in point-to-point communication. His analysis depended on two assumptions: that data are communicated in asymptotically large blocks, and that the information sources and noise channels involved satisfy certain statistical regularity properties. We revisit the central information-theoretic problems of efficient data compression and channel transmission without these assumptions. First, we describe a general development of non-asymptotic coding theorems, providing useful expressions for the fundamental limits of performance on finite data. These results may play an important role in applications where performance or delay guarantees are critical, and they offer valuable operational guidelines for the design of practical compression algorithms and error-correcting codes. Second, motivated by modern applications involving sparse and often “big” data (e.g., in neuroscience, web and social network analysis, medical imaging, and optical media recording), we state and prove a series of theorems that determine the best achievable rates of compression and transmission when the information or the noise is appropriately “sparse”. Interestingly, in these cases, the classical results in terms of the entropy and channel capacity are shown to be inaccurate, even as first-order approximations.
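To make the classical yardsticks mentioned above concrete: Shannon's source coding theorem says a memoryless Bernoulli(p) source can be compressed to roughly H(p) bits per symbol, and a binary symmetric channel with crossover probability p has capacity 1 − H(p). A minimal sketch of these standard textbook formulas (illustrative only, not part of the speaker's results):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits per symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel, crossover probability p."""
    return 1.0 - binary_entropy(p)

# A sparse source (small p) has low entropy, hence is highly compressible:
for p in (0.5, 0.1, 0.01):
    print(f"p = {p}: H(p) = {binary_entropy(p):.4f} bits, "
          f"BSC capacity = {bsc_capacity(p):.4f}")
```

For sparse data (small p), H(p) is close to zero; this is exactly the regime where, per the abstract, such first-order characterizations can become inaccurate.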

No background in information theory will be assumed.

This talk is part of the Statistics series.

