A Bit of Network Information Theory
If you have a question about this talk, please contact Eiko Yoneki.
This talk has been cancelled: due to the recent heavy snow, the speaker could not travel.
Network information theory deals with fundamental characterizations of information flow and/or compression over communication networks. While there are some satisfying characterizations for flows on networks represented as graphs, there have been few complete characterizations for networks with more complicated interactions such as those encountered in wireless networks. Given this discouraging state of affairs, a natural question to ask is whether there are methodologies to make progress on these questions.
In this talk we posit that one could gain insight into these problems by asking for less, i.e., by looking for (provable) approximate characterizations, with the hope that in terms of engineering solutions, these might actually be enough. Underlying these approximations are exact results for deterministic/lossless network communications.
We illustrate this approach by examining two problems. First is that of information flow over wireless Gaussian relay networks, where we show that the achievable rate is within a constant number of bits from the information-theoretic cut-set upper bound on the capacity of these networks. Underlying this result is an exact information-theoretic max-flow min-cut characterization of a deterministic relay network. The second is the approximate characterization of the K-description Gaussian multiple description source coding problem. Here we show that the rate region can be sandwiched between two polytopes with matching hyperplanes, separated by at most one bit. Underlying this is an exact characterization of a lossless multi-level source coding problem.
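The deterministic max-flow min-cut result underlying the first problem can be illustrated on a toy network. The sketch below is an assumption for illustration only (the network, cut labels, and transfer matrices are made up, not the networks analysed in the talk): for a linear deterministic relay network, the capacity equals the minimum, over all source-destination cuts, of the rank over GF(2) of the cut's transfer matrix.

```python
def gf2_rank(rows):
    """Rank over GF(2) of a binary matrix whose rows are given as bitmasks."""
    pivots = {}  # leading-bit position -> pivot row
    rank = 0
    for row in rows:
        while row:
            msb = row.bit_length() - 1
            if msb in pivots:
                row ^= pivots[msb]   # eliminate the leading bit
            else:
                pivots[msb] = row    # new independent row
                rank += 1
                break
    return rank

# Hypothetical diamond network: source S, relays A and B, destination D.
# Each cut is described by the binary transfer matrix from the nodes on the
# source side to the nodes on the destination side (rows as bitmasks).
cuts = {
    "S | A,B,D": [0b110, 0b011],          # broadcast cut
    "S,A | B,D": [0b101, 0b010, 0b100],
    "S,B | A,D": [0b010, 0b010],          # bottleneck: two identical rows
    "S,A,B | D": [0b100, 0b011],          # multiple-access cut
}

# Max-flow min-cut for linear deterministic networks:
# capacity = min over cuts of the GF(2) rank of the cut transfer matrix.
capacity = min(gf2_rank(m) for m in cuts.values())
print(capacity)  # 1: the rank-deficient bottleneck cut limits the flow
```

In the deterministic model this characterization is exact; the talk's result for the Gaussian network is that the achievable rate tracks the analogous cut-set bound to within a constant number of bits.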
These results suggest that approximate characterizations, within a constant number of bits, may be a promising direction for gaining insight into network information theory problems. We will conclude with implications of this methodology by illustrating several applications, such as wireless network secrecy.
Parts of this work were done in collaboration with S. Avestimehr (Cornell), S. Mohajer (EPFL), C. Tian (AT&T) and D. Tse (UC, Berkeley).
Bio: Suhas N. Diggavi received the B. Tech. degree in electrical engineering from the Indian Institute of Technology, Delhi, India, and the Ph.D. degree in electrical engineering from Stanford University, Stanford, CA, in December 1998.
After completing his Ph.D., he joined the Information Sciences Center, AT&T Shannon Laboratories, Florham Park, NJ, where he was a Principal Member of Technical Staff. He is currently on the faculty of the School of Computer and Communication Sciences, EPFL, where he heads the Laboratory for Information and Communication Systems (LICOS). He will join the University of California, Los Angeles as a full professor in 2010. His research interests include information theory, network communications, data compression and signal processing. More information can be found at http://licos.epfl.ch.
He is a recipient of the 2006 IEEE Donald Fink prize paper award, the 2005 IEEE Vehicular Technology Conference best paper award and the Okawa foundation research award. He is currently an Associate Editor for the IEEE/ACM Transactions on Networking and the IEEE Transactions on Information Theory. He has served as an associate editor for the IEEE Communication Letters, a guest editor for IEEE Selected Topics in Signal Processing and on the technical program committee for several conferences including ISIT, ICC, ITW, and Globecom. He has 8 issued patents.
This talk is part of the Computer Laboratory Systems Research Group Seminar series.