BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//MIT Statistics and Data Science Center - ECPv5.14.2.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:MIT Statistics and Data Science Center
X-ORIGINAL-URL:https://stat.mit.edu
X-WR-CALDESC:Events for MIT Statistics and Data Science Center
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20180311T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20181104T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20180928T110000
DTEND;TZID=America/New_York:20180928T120000
DTSTAMP:20220528T133958Z
CREATED:20180620T212158Z
LAST-MODIFIED:20180924T130632Z
UID:2681-1538132400-1538136000@stat.mit.edu
SUMMARY:Reverse hypercontractivity beats measure concentration for information theoretic converses
DESCRIPTION:Abstract: Concentration of measure refers to a collection of tools and results from analysis and probability theory that have been used in many areas of pure and applied mathematics. Arguably\, the first data science application of measure concentration (under the name “blowing-up lemma”) is the proof of strong converses in multiuser information theory by Ahlswede\, Gács\, and Körner in 1976. Since then\, measure concentration has found applications in many other information-theoretic problems\, most notably converse (impossibility) results in information theory. Motivated by this\, information theorists (e.g. Marton) have also contributed to the mathematical foundations of measure concentration using their information-theoretic techniques. \nNow\, after 40 years of such progress\, we found that\, amusingly\, measure concentration is not the right hammer for many of these information-theoretic applications. We introduce new machinery based on functional inequalities and reverse hypercontractivity that yields strict improvements in the sharpness of the bounds\, the generality of the source/channel distributions\, and the simplicity of the proofs. Examples covered in the talk include: 1. optimal second-order converses for distributed source-type problems (hypothesis testing\, common randomness generation\, and source coding); 2. sharpening the recent relay channel converse bounds of Wu and Ozgur with much simpler proofs. \nThis work benefited from collaborations with Thomas Courtade\, Paul Cuff\, Ayfer Ozgur\, Ramon van Handel\, and Sergio Verdú. \nBiography: Jingbo Liu received the B.E. degree from Tsinghua University\, Beijing\, China\, in 2012\, and the M.A. and Ph.D. degrees from Princeton University\, Princeton\, NJ\, USA\, in 2014 and 2018\, all in electrical engineering. His research interests include signal processing\, information theory\, coding theory\, high-dimensional statistics\, and related fields. \nHis undergraduate thesis received the best undergraduate thesis award at Tsinghua University (2012). He gave a semi-plenary presentation at the 2015 IEEE International Symposium on Information Theory in Hong Kong\, China. He was a recipient of the Princeton University Wallace Memorial Honorific Fellowship in 2016.
URL:https://stat.mit.edu/calendar/jingbo-liu/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar
GEO:42.3620185;-71.0878444
END:VEVENT
END:VCALENDAR