BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Analytic information theory and beyond - Szpankowski\, W (Purdue)
DTSTART:20100603T140000Z
DTEND:20100603T150000Z
UID:TALK25076@talks.cam.ac.uk
CONTACT:Mustapha Amrani
DESCRIPTION:Analytic information theory aims at studying problems of infor
 mation theory using analytic techniques of computer science and combinator
 ics. Following Hadamard's precept\, we tackle these problems by complex an
 alysis methods such as generating functions\, Mellin transform\, Fourier s
 eries\, saddle point method\, analytic poissonization and depoissonization
 \, and singularity analysis. This approach lies at the crossroads of compu
 ter science and information theory. In this talk\, we concentrate on one fa
 cet of information theory (i.e.\, source coding\, better known as data com
 pression)\, namely the redundancy rate problem. The redundancy rate proble
 m for a class of sources is the determination of how far the actual code len
 gth exceeds the optimal (ideal) code length. In a minimax scenario one fin
 ds the maximal r
 edundancy over all sources within a certain class while in the average sce
 nario one computes the average redundancy over all possible sources. The r
 edundancy rate problem is typical of a situation where second-order asympt
 otics play a crucial role (since the leading term of the optimal code leng
 th is known to be the entropy of the source). This problem is an ideal can
 didate for "analytic information theory". We survey our recent results o
 n the redundancy rate problem obtained by analytic methods. In particular\
 , we present our findings for the redundancy of Shannon codes\, Huffman co
 des\, minimax redundancy for memoryless sources\, Markov sources\, and ren
 ewal processes. In the second part of the talk\, we discuss the limitation
 s of classical Shannon information theory\, and argue that there is a nee
 d for post-Shannon information theory.
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
