Topics in Information and Inference Seminar

Guy Bresler (MIT)
32-D677

This seminar consists of a series of lectures, each followed by informal discussion and a social period. The topics lie at the nexus of information theory, inference, causality, estimation, and non-convex optimization. The lectures are tutorial in nature, with the goal of learning about interesting and exciting topics rather than merely hearing about the most recent results. The topics are driven by the interests of the speakers, and with the exception of the two lectures on…


Local Geometric Analysis and Applications

Lizhong Zheng (MIT)
32-D677

Abstract: Local geometric analysis is a method for defining a coordinate system in a small neighborhood in the space of distributions over a given alphabet. It is a powerful technique because the notions of distance, projection, and inner product defined this way are useful in optimization problems involving distributions, such as regression. It has been used in many places in the literature, such as correlation analysis and correspondence analysis. In this talk, we will go through some of the basic…
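As a small numerical illustration of this local view (a hypothetical sketch with made-up numbers, not material from the talk): near a reference distribution Q, the KL divergence agrees with half the chi-squared distance to second order, which is what makes a small neighborhood of the simplex behave like a Euclidean space with an inner product:

```python
import numpy as np

def kl(p, q):
    """KL divergence D(P || Q) in nats."""
    return float(np.sum(p * np.log(p / q)))

def chi2(p, q):
    """Chi-squared divergence: sum over x of (p(x) - q(x))^2 / q(x)."""
    return float(np.sum((p - q) ** 2 / q))

q = np.array([0.5, 0.3, 0.2])           # reference distribution
phi = np.array([1.0, -0.4, -0.6])       # perturbation direction, sums to 0
eps = 1e-3
p = q + eps * phi                       # a nearby distribution

# Locally, D(P || Q) ~ (1/2) * chi2(P, Q); the gap is O(eps^3).
print(kl(p, q), 0.5 * chi2(p, q))
```

Shrinking `eps` makes the two printed numbers agree to more digits, which is the sense in which the local geometry is quadratic.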


Graphical models under total positivity

Caroline Uhler (MIT)
32-D677

Abstract: We discuss properties of distributions that are multivariate totally positive of order two (MTP2). Such distributions appear in the context of positive dependence, ferromagnetism in the Ising model, and various latent models. While such distributions have a long history in probability theory and statistical physics, in this talk I will discuss such distributions in the context of high-dimensional statistics and graphical models. In particular, I will show that MTP2 in the Gaussian…
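A concrete entry point to the Gaussian case (a sketch assuming the Karlin–Rinott characterization, not code from the talk): a nondegenerate Gaussian is MTP2 exactly when its precision matrix K = Σ⁻¹ has no positive off-diagonal entries, i.e. K is an M-matrix. This makes the property easy to check numerically:

```python
import numpy as np

def is_gaussian_mtp2(sigma, tol=1e-10):
    """A Gaussian N(0, sigma) is MTP2 iff every off-diagonal entry
    of the precision matrix K = inv(sigma) is nonpositive."""
    k = np.linalg.inv(sigma)
    off_diag = k - np.diag(np.diag(k))
    return bool(np.all(off_diag <= tol))

# AR(1) covariance rho^|i-j| with rho = 0.5: the precision matrix is
# tridiagonal with negative off-diagonals, so this Gaussian is MTP2.
ar1 = np.array([[1.0, 0.5, 0.25],
                [0.5, 1.0, 0.5],
                [0.25, 0.5, 1.0]])
print(is_gaussian_mtp2(ar1))    # True

# A negatively correlated pair is not MTP2: its precision matrix
# has a positive off-diagonal entry.
neg = np.array([[1.0, -0.5],
                [-0.5, 1.0]])
print(is_gaussian_mtp2(neg))    # False
```

The AR(1) example also hints at the graphical-model connection: the zero pattern of the tridiagonal precision matrix is exactly the Markov-chain graph.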


Strong data processing inequalities and information percolation

Yury Polyanskiy (MIT)
32-D677

Abstract: The data-processing inequality, that is, $I(U;Y) \le I(U;X)$ for a Markov chain $U \to X \to Y$, has been the method of choice for proving impossibility (converse) results in information theory and many other disciplines. A channel-dependent improvement is called the strong data-processing inequality (or SDPI). In this talk we will: a) review SDPIs; b) show how point-to-point SDPIs can be combined into an SDPI for a network; c) show recent…
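Both inequalities can be checked on a toy chain (a hedged sketch with made-up parameters, not code from the talk). For a binary symmetric channel BSC(ε), the SDPI constant for mutual information is known to be $(1-2\varepsilon)^2$, so $I(U;Y) \le (1-2\varepsilon)^2 I(U;X)$ when the second hop is BSC(ε):

```python
import numpy as np

def mutual_info(pxy):
    """I(X;Y) in nats, computed from a joint pmf matrix pxy."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def bsc(eps):
    """Transition matrix of a binary symmetric channel, flip prob eps."""
    return np.array([[1 - eps, eps], [eps, 1 - eps]])

pu = np.array([0.5, 0.5])          # uniform binary source U
pux = np.diag(pu) @ bsc(0.1)       # joint of (U, X): X = U through BSC(0.1)
puy = pux @ bsc(0.2)               # joint of (U, Y): Y = X through BSC(0.2)

i_ux = mutual_info(pux)
i_uy = mutual_info(puy)
eta = (1 - 2 * 0.2) ** 2           # SDPI constant of BSC(0.2)

print(i_uy <= i_ux)                # ordinary data-processing inequality
print(i_uy <= eta * i_ux)          # strong data-processing inequality
```

The second check is strictly stronger: here the channel-dependent constant η = 0.36 certifies that most of the information about U is lost in the second hop, which is the kind of contraction that composes across a network.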



© MIT Statistics + Data Science Center | 77 Massachusetts Avenue | Cambridge, MA 02139-4307 | 617-253-1764 |
      