BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Statistics
SUMMARY:Communication in the presence of sparsity - Yiannis Kontoyiannis (Cambridge)
DTSTART;TZID=Europe/London:20190529T140000
DTEND;TZID=Europe/London:20190529T150000
UID:TALK125320AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/125320
DESCRIPTION:In his seminal 1948 work\, Claude Shannon determined the fundamental limits of\nthe best achievable performance in point-to-point communication. His analysis\ndepended on two assumptions: That data are communicated in asymptotically\nlarge blocks\, and that the information sources and the noise channels involved\nsatisfy certain statistical regularity properties. We revisit the central\ninformation-theoretic problems of efficient data compression and channel\ntransmission without these assumptions. First\, we describe a general\ndevelopment of non-asymptotic coding theorems\, providing useful expressions\nfor the fundamental limits of performance on finite data. These results may\nplay an important role in applications where performance or delay guarantees\nare critical\, and they offer valuable operational guidelines for the design of\npractical compression algorithms and error correcting codes. Second\, motivated\nby modern applications involving sparse and often “big” data (e.g.\, in\nneuroscience\, web and social network analysis\, medical imaging\, and optical\nmedia recording)\, we state and prove a series of theorems that determine the\nbest achievable rates of compression and transmission\, when the information or\nthe noise are appropriately “sparse”. Interestingly\, in these cases\, the\nclassical results in terms of the entropy and channel capacity are shown to be\ninaccurate\, even as first-order approximations.\n\nNo background in information theory will be assumed.\n
LOCATION:Centre for Mathematical Sciences\, meeting room MR 14
CONTACT:HoD Secretary\, DPMMS
END:VEVENT
END:VCALENDAR