Information Theory, Codes, and Compression
If you have a question about this talk, please contact Alessandro Davide Ialongo.
Information Theory and Machine Learning are intimately related fields, and perhaps two sides of the same coin. This tutorial gives a basic introduction to information theory and code construction, and shows how Bayesian inference can be used to solve some interesting problems in communication. A special focus will be on Bayesian approaches to data compression, randomness, and the relation to perfect sampling algorithms.
The talk aims to be fairly accessible and easy to follow.
No advance reading is required.
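As a flavour of the topics above, the following is a minimal, illustrative Python sketch (not taken from the talk materials) showing the basic link between information content and code construction: it computes the Shannon entropy of a toy source and compares it with the average length of a Huffman code built for that source.

```python
# Illustrative sketch only: Shannon entropy of a toy source versus the
# average length of a Huffman code constructed for it.
from collections import Counter
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Return the codeword lengths of a Huffman code for the given probabilities."""
    # Each heap item: (subtree probability, unique tie-breaker id, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # every symbol in the merged subtree gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_code_lengths(probs)))
print(f"entropy H = {H:.3f} bits, Huffman average length L = {L:.3f} bits")
# Source coding theorem: H <= L < H + 1; for dyadic probabilities, L equals H exactly.
```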
This talk is part of the Machine Learning Reading Group @ CUED series.