
Entropy comparison inequalities with applications to additive noise channels


  • Speaker: Dr Lampros Gavalakis, University of Cambridge
  • Date and time: Wednesday 19 March 2025, 14:00-15:00
  • Venue: MR5, CMS Pavilion A.

If you have a question about this talk, please contact Prof. Ramji Venkataramanan.

Motivated by a question of Eskenazis, Nayar and Tkocz (2018), which remains open, we will present an inequality comparing the entropy of a sum of two i.i.d. random variables with the entropy of the same sum when one of the summands is replaced by a Gaussian of the same variance. This inequality involves a quantity known as entropic doubling, which is small when the Entropy Power Inequality (EPI) is close to equality. We will then discuss the question of stability in the EPI and present a qualitative stability result in the weak sense. Furthermore, by extending our inequality from the i.i.d. case to general independent random variables, we will show how to improve the “half-a-bit” bound of Zamir and Erez (2004), which quantifies the robustness of Gaussian codebooks, in the low-capacity regime. Finally, we will present extensions to multiple-access (MAC) and MIMO channels with general additive noise.
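
As background, here is a sketch of the standard statements (the normalization of entropic doubling below is an assumption and may differ from the convention used in the talk). Writing h(X) for the differential entropy of a random variable X with a density, and N(X) = e^{2h(X)}/(2πe) for its entropy power, the EPI states that for independent X and Y,

    N(X + Y) ≥ N(X) + N(Y),

with equality if and only if X and Y are Gaussian. In particular, for i.i.d. X and X' it gives

    h(X + X') ≥ h(X) + (1/2) log 2,

with equality in the Gaussian case, so the gap δ[X] = h(X + X') - h(X) - (1/2) log 2 is nonnegative, vanishes exactly for Gaussians, and is small precisely when the EPI is close to equality.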

The talk is based on joint work with Ioannis Kontoyiannis and Mokshay Madiman.

This talk is part of the Information Theory Seminar series.
