Calendar of Events

Upcoming Stochastics and Statistics Seminar talks: Giles Hooker (Wharton School, UPenn), Tudor Manole (MIT), Anna Korba (ENSAE/CREST), Tijana Zrnic (Stanford University), and Christopher Harshaw (Columbia University). Details below.

Trees and V’s: Inference for Ensemble Models

Giles Hooker, Wharton School - UPenn
E18-304

Abstract: This talk discusses uncertainty quantification and inference using ensemble methods. Recent theoretical developments inspired by random forests have cast bagging-type methods as U-statistics when bootstrap samples are replaced by subsamples, yielding a central limit theorem and hence the potential for inference. However, carrying this out requires estimating a variance, and all proposed estimators of that variance exhibit substantial upward bias. In this talk, we convert subsampling without replacement to subsampling with replacement, resulting in V-statistics for which we prove…
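
For readers less familiar with the terminology, the distinction the abstract turns on can be written out explicitly; these are the standard textbook definitions of U- and V-statistics with a kernel h of order k (background notation, not taken from the talk):

\[
U_n = \binom{n}{k}^{-1} \sum_{1 \le i_1 < \cdots < i_k \le n} h(X_{i_1}, \ldots, X_{i_k}),
\qquad
V_n = \frac{1}{n^k} \sum_{i_1=1}^{n} \cdots \sum_{i_k=1}^{n} h(X_{i_1}, \ldots, X_{i_k}).
\]

The U-statistic averages the kernel over subsamples drawn without replacement, while the V-statistic averages over all ordered k-tuples, i.e. subsamples drawn with replacement, which mirrors the move from subsampling without replacement to subsampling with replacement described above.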


Central Limit Theorems for Smooth Optimal Transport Maps

Tudor Manole, MIT
E18-304

Abstract: One of the central objects in the theory of optimal transport is the Brenier map: the unique monotone transformation which pushes forward an absolutely continuous probability law onto any other given law. Recent work has identified a class of plugin estimators of Brenier maps which achieve the minimax L^2 risk, and are simple to compute. In this talk, we show that such estimators obey pointwise central limit theorems. This provides a first step toward the question of performing statistical…
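
As background (standard optimal-transport notation rather than material from the talk), the Brenier map T from an absolutely continuous measure \mu to a target \nu is the gradient of a convex potential pushing \mu forward onto \nu, and it minimizes the quadratic transport cost:

\[
T = \nabla \varphi \ \text{ with } \varphi \text{ convex}, \quad T_{\#}\mu = \nu,
\qquad
T = \arg\min_{S \,:\, S_{\#}\mu = \nu} \int \|x - S(x)\|^{2} \, d\mu(x).
\]

Roughly speaking, a plugin estimator in the sense of the abstract replaces the unknown measures by estimates built from the samples and uses the optimal map between those estimates.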


Sampling through optimization of divergences on the space of measures

Anna Korba, ENSAE/CREST
E18-304

Abstract: Sampling from a target measure when only partial information is available (e.g. an unnormalized density as in Bayesian inference, or true samples as in generative modeling) is a fundamental problem in computational statistics and machine learning. The sampling problem can be cast as the optimization, over the space of probability distributions, of a well-chosen discrepancy, e.g. a divergence or distance to the target. In this talk, I will discuss several properties of sampling algorithms for some choices of discrepancies (standard ones,…
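
One classic instance of this "sampling as optimization" viewpoint is the unadjusted Langevin algorithm, whose law evolves as a discretized Wasserstein gradient flow of the KL divergence to the target. The sketch below is illustrative only and is not necessarily one of the algorithms analyzed in the talk.

import numpy as np

def unadjusted_langevin(grad_log_p, x0, step=1e-2, n_steps=5000, rng=None):
    """Approximate samples from p(x) ∝ exp(log_p(x)) given only grad_log_p.

    Each iterate takes a gradient step on log p plus Gaussian noise; this is
    the Euler discretization of the Langevin diffusion, which can be read as
    a gradient flow of KL(. || p) over the space of probability measures.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * noise
        samples.append(x.copy())
    return np.array(samples)

# Example: a standard Gaussian target, for which grad log p(x) = -x.
samples = unadjusted_langevin(lambda x: -x, x0=np.zeros(2))
print(samples[1000:].mean(axis=0), samples[1000:].var(axis=0))  # ≈ 0 mean, ≈ 1 variance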


A Flexible Defense Against the Winner’s Curse

Tijana Zrnic, Stanford University
E18-304

Abstract: Across science and policy, decision-makers often need to draw conclusions about the best candidate among competing alternatives. For instance, researchers may seek to infer the effectiveness of the most successful treatment or determine which demographic group benefits most from a specific treatment. Similarly, in machine learning, practitioners are often interested in the population performance of the model that empirically performs best. However, cherry-picking the best candidate leads to the winner’s curse: the observed performance for the winner is biased…
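
A toy simulation (illustrative only, not the defense proposed in the talk) makes the bias concrete: even when every candidate has the same true mean, the empirically best one looks better than it is.

import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_obs, n_reps = 20, 50, 2000
true_mean = 0.0  # every candidate is equally good

winner_estimates = []
for _ in range(n_reps):
    # Each candidate's performance is estimated from n_obs noisy observations.
    estimates = rng.normal(true_mean, 1.0, size=(n_candidates, n_obs)).mean(axis=1)
    winner_estimates.append(estimates.max())  # report the best-looking candidate

# The reported value for the "winner" is biased upward relative to the truth (0).
print(np.mean(winner_estimates))  # noticeably above 0, roughly 0.26 here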


The Conflict Graph Design: Estimating Causal Effects Under Interference

Christopher Harshaw, Columbia University
E18-304

Abstract: From clinical trials to corporate strategy, randomized experiments are a reliable methodological tool for estimating causal effects. In recent years, there has been growing interest in causal inference under interference, where treatment given to one unit can affect outcomes of other units. While the literature on interference has focused primarily on unbiased and consistent estimation, designing randomized network experiments to ensure tight rates of convergence is relatively under-explored. Not only are the optimal rates of estimation for different…
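
To make interference concrete (a toy simulation, not the conflict graph design from the talk), the sketch below generates outcomes that depend on a unit's own treatment and on the fraction of treated neighbors. In this simple additive model, the naive difference in means under independent Bernoulli assignment recovers only the direct effect and misses the spillover, so it cannot answer questions about treating the whole network.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
direct_effect, spillover_effect = 1.0, 0.5

# A crude random "network": each unit has 5 randomly chosen neighbors.
neighbors = rng.integers(0, n, size=(n, 5))

estimates = []
for _ in range(500):
    z = rng.binomial(1, 0.5, size=n)               # independent Bernoulli assignment
    frac_treated_nbrs = z[neighbors].mean(axis=1)  # exposure through the network
    y = direct_effect * z + spillover_effect * frac_treated_nbrs + rng.normal(size=n)
    estimates.append(y[z == 1].mean() - y[z == 0].mean())

# Treating everyone versus no one shifts outcomes by 1.0 + 0.5 = 1.5, but the
# naive difference in means converges to roughly 1.0: the spillover is missed.
print(np.mean(estimates))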


