Fragility of Asymptotic Agreement under Bayesian Learning
March 21, 2008 @ 11:00 am - 12:00 pm
David Forney (MIT LIDS)
One of Shannon’s earliest results was his determination of the capacity of the binary symmetric channel (BSC). Shannon went on to show that, with randomly chosen codes and optimal decoding, the probability of decoding error decreases exponentially in the block length for any transmission rate less than capacity. Much of the important early work of Shannon, Elias, Fano, and Gallager was devoted to determining bounds on the corresponding “error exponent.”

A later approach to this problem, pioneered by Csiszár and Körner and now adopted in many modern information theory courses such as MIT’s 6.441, determines error exponents using large-deviation theory. This approach has several advantages: (a) it can be remarkably simple and intuitive; (b) it gives correct error exponents, not just bounds; (c) it gives insight into how decoding errors typically occur.

We illustrate this approach by developing the correct error exponents at all rates for both random codes and random linear codes on the BSC, assuming optimal decoding, and we discuss how decoding errors typically occur. In particular, we show why the classical algebraic coding approach never had any hope of approaching channel capacity, even on the BSC. This talk is intended to be pitched at an elementary and tutorial level.
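To make the objects in the abstract concrete, here is a sketch in standard notation (illustrative background, not taken from the talk itself): for a BSC with crossover probability p, the capacity and the classical random-coding exponent in its Csiszár–Körner large-deviation form are

```latex
% Capacity of the BSC with crossover probability p:
\[
  C = 1 - h(p), \qquad h(p) = -p\log_2 p - (1-p)\log_2(1-p).
\]
% Random-coding exponent in large-deviation form (uniform inputs, ML decoding),
% so that the error probability satisfies P_e \doteq 2^{-nE_r(R)} at rate R < C:
\[
  E_r(R) = \min_{0 \le q \le 1/2}\Big[\, D(q\|p) + \big|\,1 - h(q) - R\,\big|^{+}\,\Big],
  \qquad
  D(q\|p) = q\log_2\frac{q}{p} + (1-q)\log_2\frac{1-q}{1-p}.
\]
```

The minimizing q exposes the typical error event in the large-deviation picture: the channel flips an atypical fraction q of the bits, which has probability roughly 2^{-nD(q||p)}, and given such a noise composition a random code of rate R is likely to contain a competing codeword whenever 1 - h(q) < R. As a quick numerical sketch of this formula (the function names and grid-minimization approach here are ours, chosen for illustration), one can evaluate E_r(R) by minimizing over q on a grid:

```python
import numpy as np

def h2(x):
    """Binary entropy in bits; clipped so h2(0) = h2(1) = 0 numerically."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def dkl(q, p):
    """Binary KL divergence D(q || p) in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return q * np.log2(q / p) + (1 - q) * np.log2((1 - q) / (1 - p))

def random_coding_exponent(R, p, grid=100001):
    """E_r(R) = min over 0 <= q <= 1/2 of D(q||p) + max(1 - h2(q) - R, 0)."""
    q = np.linspace(0.0, 0.5, grid)
    return np.min(dkl(q, p) + np.maximum(1 - h2(q) - R, 0.0))

p = 0.11  # crossover probability; C = 1 - h2(0.11) is about 0.5 bit/use
for R in (0.1, 0.3, 0.45):
    print(f"R = {R:.2f}: E_r(R) ~ {random_coding_exponent(R, p):.4f}")
```

The exponent decreases to zero as R approaches capacity, matching the qualitative statement in the abstract that the error probability decays exponentially at any rate below C.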