Stochastics and Statistics Seminar

Reverse hypercontractivity beats measure concentration for information theoretic converses

September 28, 2018 @ 11:00 am - 12:00 pm

Jingbo Liu (MIT)


Abstract: Concentration of measure refers to a collection of tools and results from analysis and probability theory that have been used in many areas of pure and applied mathematics. Arguably, the first data science application of measure concentration (under the name "blowing-up lemma") is the proof of strong converses in multiuser information theory by Ahlswede, Gács and Körner in 1976. Since then, measure concentration has found applications in many other information theoretic problems, most notably the converse (impossibility) results in information theory. Motivated by this, information theorists (e.g. Marton) have also contributed to the mathematical foundations of measure concentration using their information-theoretic techniques.

Now, after 40 years of such progress, we find, amusingly, that measure concentration is not the right hammer for many of these information theoretic applications. We introduce a new machinery based on functional inequalities and reverse hypercontractivity which yields strict improvements in the sharpness of the bounds, the generality of the source/channel distributions, and the simplicity of the proofs. Examples covered in the talk include: 1. optimal second-order converses for distributed source-type problems (hypothesis testing, common randomness generation, and source coding); 2. sharpening the recent relay channel converse bounds of Wu and Ozgur with much simpler proofs.
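For readers unfamiliar with the term, reverse hypercontractivity can be illustrated by its classical form for the noise operator, due to Borell; this is standard background only, not necessarily the exact inequality used in the work described above:

```latex
% Reverse hypercontractivity (Borell): for the noise operator $T_\rho$
% (e.g. the Bonami--Beckner operator on the Boolean hypercube), every
% nonnegative function $f$, and exponents $q \le p < 1$ ($q$ may be negative),
\[
  \| T_\rho f \|_q \;\ge\; \| f \|_p
  \qquad \text{whenever } 0 \le \rho \le \sqrt{\frac{1-p}{1-q}} ,
\]
% in contrast with ordinary hypercontractivity, which gives the upper bound
% $\| T_\rho f \|_q \le \| f \|_p$ for exponents $1 \le p \le q$ and
% $\rho \le \sqrt{(p-1)/(q-1)}$.
```

Because the exponents lie below 1, the inequality lower-bounds how much the noise operator can shrink a nonnegative function, which is the kind of estimate a converse (impossibility) argument needs.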

The work benefited from collaborations with Thomas Courtade, Paul Cuff, Ayfer Ozgur, Ramon van Handel, and Sergio Verdú.

Biography: Jingbo Liu received the B.E. degree from Tsinghua University, Beijing, China, in 2012, and the M.A. and Ph.D. degrees from Princeton University, Princeton, NJ, USA, in 2014 and 2018, respectively, all in electrical engineering. His research interests include signal processing, information theory, coding theory, high dimensional statistics, and related fields. His undergraduate thesis received the best undergraduate thesis award at Tsinghua University (2012). He gave a semi-plenary presentation at the 2015 IEEE International Symposium on Information Theory, Hong Kong, China. He was a recipient of the Princeton University Wallace Memorial Honorific Fellowship in 2016.

MIT Statistics + Data Science Center
Massachusetts Institute of Technology
77 Massachusetts Avenue
Cambridge, MA 02139-4307