Topics in Information and Inference Seminar
November 1, 2018 @ 4:00 pm - 5:00 pm
Abbas El Gamal (Stanford University)
*This lecture is the second of two; the first was given on Thursday, October 25th.
Title: Randomness and Information I and II
Abstract: Exact or approximate generation of random variables with prescribed statistics from a given randomness source has many important applications, including random number generation from physical sources, Monte Carlo simulation, and randomized algorithms, e.g., for cryptography, optimization, and machine learning. It is also closely related to several fundamental questions in information theory, theoretical computer science, and quantum information. The framework, measures, and techniques of information theory have played a key role in determining the fundamental limits for various settings in this area. In these two lectures, I will give a tutorial on the basic results in randomness generation (generating random variables from fair coin flips) and extraction (generating fair coin flips from a random process), with emphasis on their fundamental limits.

In the first lecture, I will show that (as in compression) entropy naturally arises as the limit on both exact and approximate generation and extraction. I will present classic results on exact generation by Knuth and Yao and on exact extraction by von Neumann and Elias, as well as some follow-on work. I will then discuss the basic results on approximate randomness generation and extraction, including the use of random coding and binning to show the existence of optimal schemes. However, I will only briefly touch on the vast literature on approximate extraction and its applications in computer science.

The second lecture will deal with multiterminal generation and extraction settings. I will first present classic results on approximate simulation of channel output statistics (resolvability) by Han and Verdú and on approximate distributed generation by Wyner. We will see that mutual information arises as a natural limit in both settings. I will then present results on exact distributed randomness generation and on exact distributed random key extraction from correlated sources. Finally, I will discuss several natural common randomness measures that arise from these results. I will provide proofs for several of the results presented, albeit not always in their most general or complete forms. I will give a third lecture, on our recent results on distributed randomness generation (mostly based on work with Cheuk Ting Li), at the LIDS seminar on November 13.
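As a concrete flavor of the exact extraction results mentioned in the abstract, the short Python sketch below illustrates von Neumann's classic pairing trick for producing fair coin flips from i.i.d. biased flips; it is a minimal illustration for the reader, not material from the lectures, and the function and variable names are our own.

```python
import random

def von_neumann_extract(bits):
    """Von Neumann's extractor: turn i.i.d. biased coin flips into fair bits.

    Input bits are taken in non-overlapping pairs: a '01' pair emits 0,
    a '10' pair emits 1, and '00'/'11' pairs are discarded. Because the
    two unequal pairs are equally likely for any fixed bias, the output
    bits are exactly fair (though many input bits are wasted, which is
    what Elias's refinement and later work improve on).
    """
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)  # '01' -> 0, '10' -> 1
    return out

# Example: extract fair bits from flips of a coin with P(1) = 0.8.
biased = [1 if random.random() < 0.8 else 0 for _ in range(10_000)]
fair = von_neumann_extract(biased)
print(len(fair), sum(fair) / max(len(fair), 1))  # output is roughly balanced
```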
The full schedule of this Seminar series can be viewed here: http://stellar.mit.edu/S/project/infoandinf/
This seminar consists of a series of lectures, each followed by a period of informal discussion and socializing. The topics are at the nexus of information theory, inference, causality, estimation, and non-convex optimization. The lectures are intended to be tutorial in nature, with the goal of learning about interesting and exciting topics rather than merely hearing about the most recent results. The topics are driven by the interests of the speakers, and with the exception of the two lectures on randomness and information, there is no planned coherence or dependency among them. Ad hoc follow-on meetings about any of the topics presented are highly encouraged.