Calendar of Events


Using Bagged Posteriors for Robust Inference

Jonathan Huggins (Boston University)
37-212

IDS.190 – Topics in Bayesian Modeling and Computation **PLEASE NOTE ROOM CHANGE TO BUILDING 37-212 FOR THE WEEKS OF 10/30 AND 11/6** Speaker: Jonathan Huggins (Boston University) Abstract: Standard Bayesian inference is known to be sensitive to misspecification between the model and the data-generating mechanism, leading to unreliable uncertainty quantification and poor predictive performance.…
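The bagging idea behind bagged posteriors can be sketched in a few lines of numpy: fit the posterior on bootstrap resamples of the data and average the results. This is an illustrative toy for a conjugate normal-mean model, not the speaker's method or implementation; all function names here are hypothetical.

```python
import numpy as np

def posterior_mean_var(x, prior_var=10.0, noise_var=1.0):
    """Conjugate posterior for the mean of N(mu, noise_var) under a N(0, prior_var) prior."""
    n = len(x)
    var = 1.0 / (1.0 / prior_var + n / noise_var)
    mean = var * x.sum() / noise_var
    return mean, var

def bayesbag(x, n_boot=50, rng=None):
    """Bagging sketch: compute the posterior on bootstrap resamples, then average."""
    rng = np.random.default_rng(rng)
    means, variances = [], []
    for _ in range(n_boot):
        xb = rng.choice(x, size=len(x), replace=True)
        m, v = posterior_mean_var(xb)
        means.append(m)
        variances.append(v)
    means = np.array(means)
    # Mixture-of-posteriors moments: average mean; within- plus between-resample variance.
    return means.mean(), np.mean(variances) + means.var()
```

The between-resample variance term is what widens the bagged posterior relative to the standard one, which is the intuition for its robustness under misspecification.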

Probabilistic Inference and Learning with Stein’s Method

Lester Mackey (Microsoft Research)
37-212

IDS.190 – Topics in Bayesian Modeling and Computation **PLEASE NOTE ROOM CHANGE TO BUILDING 37-212 FOR THE WEEKS OF 10/30 AND 11/6** Speaker: Lester Mackey (Microsoft Research) Abstract: Stein’s method is a powerful tool from probability theory for bounding the distance between probability distributions.  In this talk, I’ll describe how this tool designed to prove central…
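One concrete object from this line of work is the kernel Stein discrepancy, which measures how far a sample is from a target distribution using only the target's score function. Below is a minimal numpy sketch for a one-dimensional standard normal target with an RBF kernel; it is a generic textbook estimator, not code from the talk.

```python
import numpy as np

def ksd_gaussian(x, h=1.0):
    """
    V-statistic estimate of the squared kernel Stein discrepancy between the
    sample x and a standard normal target, using an RBF kernel of bandwidth h.
    Uses the closed-form Stein kernel for score s(x) = -x.
    """
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]          # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))       # RBF kernel matrix
    xy = x[:, None] * x[None, :]         # product term from s(x_i) s(x_j)
    # Stein kernel: s(x)s(y)k + s(x) dk/dy + s(y) dk/dx + d2k/dxdy, simplified.
    u = k * (xy - d**2 / h**2 + 1.0 / h**2 - d**2 / h**4)
    return u.mean()
```

Samples drawn from the target give a discrepancy near zero, while samples from a shifted distribution give a visibly larger value, which is how such discrepancies are used for goodness-of-fit testing and sample quality assessment.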

One-shot Information Theory via Poisson Processes

Cheuk Ting Li (UC Berkeley)
E18-304

Abstract: In information theory, coding theorems are usually proved in the asymptotic regime where the blocklength tends to infinity. While there are techniques for finite blocklength analysis, they are often more complex than their asymptotic counterparts. In this talk, we study the use of Poisson processes in proving coding theorems, which not only gives sharp…

SDP Relaxation for Learning Discrete Structures: Optimal Rates, Hidden Integrality, and Semirandom Robustness

Yudong Chen (Cornell)
E18-304

Abstract: We consider the problems of learning discrete structures from network data under statistical settings. Popular examples include various block models, Z2 synchronization and mixture models. Semidefinite programming (SDP) relaxation has emerged as a versatile and robust approach to these problems. We show that despite being a relaxation, SDP achieves the optimal Bayes error rate…
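To make the recovery problem concrete: in Z2 synchronization one observes a noisy rank-one matrix Y ≈ zzᵀ with z ∈ {±1}ⁿ and tries to recover the signs. The talk is about the SDP relaxation; the sketch below uses the simpler spectral relaxation (top eigenvector) as a dependency-free illustration of the same problem, with illustrative function names.

```python
import numpy as np

def z2_recover_spectral(Y):
    """Recover +/-1 labels from Y ~ z z^T + noise via the leading eigenvector."""
    vals, vecs = np.linalg.eigh(Y)   # eigenvalues in ascending order
    v = vecs[:, -1]                  # eigenvector of the largest eigenvalue
    zhat = np.sign(v)
    zhat[zhat == 0] = 1.0
    return zhat

def misclassification(z, zhat):
    """Fraction of disagreements, counted up to the unavoidable global sign flip."""
    err = np.mean(z != zhat)
    return min(err, 1.0 - err)
```

The SDP replaces the rank-one constraint with a positive-semidefinite one and, as the abstract notes, can achieve optimal rates and robustness guarantees that the plain spectral method lacks.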

Artificial Bayesian Monte Carlo Integration: A Practical Resolution to the Bayesian (Normalizing Constant) Paradox

Xiao-Li Meng (Harvard University)
E18-304

Abstract: Advances in Markov chain Monte Carlo in the past 30 years have made Bayesian analysis a routine practice. However, there is virtually no practice of performing Monte Carlo integration from the Bayesian perspective; indeed, this problem has earned the “paradox” label in the context of computing normalizing constants (Wasserman, 2013). We first use the modeling-what-we-ignore…
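For context on the normalizing-constant problem: the standard frequentist workhorse is importance sampling, which estimates Z = ∫ q(x) dx as an expectation under a tractable proposal. The sketch below shows that baseline estimator, not the Bayesian resolution proposed in the talk; the function name and signature are illustrative.

```python
import numpy as np

def normalizing_constant_is(log_q, sample_p, log_p, n=100_000, rng=None):
    """
    Importance-sampling estimate of Z = integral of exp(log_q(x)) dx:
    Z = E_p[ q(X) / p(X) ] for draws X ~ p, so the sample mean of the
    weights q(x_i)/p(x_i) is an unbiased estimator of Z.
    """
    rng = np.random.default_rng(rng)
    x = sample_p(rng, n)
    log_w = log_q(x) - log_p(x)
    return np.exp(log_w).mean()
```

For example, with the unnormalized standard normal density q(x) = exp(-x²/2) and a wider normal proposal, the estimate converges to the true constant √(2π) ≈ 2.5066.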

Understanding Machine Learning with Statistical Physics

Lenka Zdeborová (Institute of Theoretical Physics, CNRS)
E18-304

Abstract: The affinity between statistical physics and machine learning has a long history; this is reflected even in machine learning terminology, part of which is adopted from physics. Current theoretical challenges and open questions about deep learning and statistical learning call for a unified account of the following three ingredients: (a) the dynamics of the learning…

Stability of a Fluid Model for Fair Bandwidth Sharing with General File Size Distributions

Ruth J Williams (University of California, San Diego)
E18-304

Abstract: Massoulie and Roberts introduced a stochastic model for a data communication network where file sizes are generally distributed and the network operates under a fair bandwidth sharing policy.  It has been a standing problem to prove stability of this general model when the average load on the system is less than the network's capacity. A crucial step in an approach to…

A Causal Exposure Response Function with Local Adjustment for Confounding: A study of the health effects of long-term exposure to low levels of fine particulate matter

Francesca Dominici (Harvard University)
E18-304

Abstract: In the last two decades, ambient levels of air pollution have declined substantially. Yet, as mandated by the Clean Air Act, we must continue to address the following question: is exposure to levels of air pollution that are well below the National Ambient Air Quality Standards (NAAQS) harmful to human health? Furthermore, the highly…

Automated Data Summarization for Scalability in Bayesian Inference

Tamara Broderick (MIT)
E18-304

Abstract: Many algorithms take prohibitively long to run on modern, large data sets. But even in complex data sets, many data points may be at least partially redundant for some task of interest. So one might instead construct and use a weighted subset of the data (called a “coreset”) that is much smaller than the…
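The coreset idea in the abstract can be illustrated with the most naive construction: subsample m points uniformly and weight each by n/m, so that weighted sums over the coreset are unbiased for sums over the full data. This toy is only a baseline, not the automated coreset constructions the talk describes; names are illustrative.

```python
import numpy as np

def uniform_coreset(x, m, rng=None):
    """Uniformly subsample m of n points, weighting each by n/m so that
    weighted coreset sums are unbiased estimates of full-data sums."""
    rng = np.random.default_rng(rng)
    n = len(x)
    idx = rng.choice(n, size=m, replace=False)
    weights = np.full(m, n / m)
    return x[idx], weights
```

For instance, the weighted coreset sum divided by n approximates the full-data mean; more sophisticated coresets choose points and weights to control the approximation error for a specific model's likelihood rather than relying on uniform sampling.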


MIT Statistics + Data Science Center
Massachusetts Institute of Technology
77 Massachusetts Avenue
Cambridge, MA 02139-4307
617-253-1764