Extremes of Random Coding Error Exponents
If you have a question about this talk, please contact Rachel Fogg.
In this talk, we will briefly review Gallager’s random coding achievability proof. We will show that Gallager’s random coding error exponent of an arbitrary binary-input memoryless symmetric channel is upper-bounded by that of the binary erasure channel and lower-bounded by that of the binary symmetric channel of the same capacity. We will then illustrate how this result can be applied to find the extremes of the channel dispersion for the same class of channels.
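As a brief sketch of the quantities involved (using the standard textbook definitions, with the uniform input distribution, which is optimal for binary-input memoryless symmetric channels), Gallager’s random coding error exponent of a channel W at rate R is

    E_0(\rho) = -\log_2 \sum_{y} \left[ \tfrac{1}{2}\, W(y \mid 0)^{\frac{1}{1+\rho}} + \tfrac{1}{2}\, W(y \mid 1)^{\frac{1}{1+\rho}} \right]^{1+\rho},
    \qquad
    E_r(R) = \max_{0 \le \rho \le 1} \bigl[ E_0(\rho) - \rho R \bigr].

In this notation (not from the original listing), the result stated above reads: for any binary-input memoryless symmetric channel W of capacity C and any rate 0 \le R \le C,

    E_r^{\mathrm{BSC}(C)}(R) \;\le\; E_r^{W}(R) \;\le\; E_r^{\mathrm{BEC}(C)}(R),

where BSC(C) and BEC(C) denote the binary symmetric and binary erasure channels of the same capacity C. The channel dispersion mentioned at the end is, in the standard definition, the variance of the information density under the uniform input,

    V(W) = \mathrm{Var}\!\left[ \log_2 \frac{W(Y \mid X)}{P_Y(Y)} \right].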
This talk is part of the Signal Processing and Communications Lab Seminars series.