
Past Events
March 2021
Detection Thresholds for Distribution-Free Non-Parametric Tests: The Curious Case of Dimension 8
Bhaswar B. Bhattacharya, UPenn Wharton
Abstract: Two of the fundamental problems in non-parametric statistical inference are goodness-of-fit and two-sample testing. These two problems have been extensively studied and several multivariate tests have been proposed over the last thirty years, many of which are based on geometric graphs. These include, among several others, the celebrated Friedman-Rafsky two-sample test based on the minimal spanning tree and the K-nearest neighbor graphs, and the Bickel-Breiman spacings tests for goodness-of-fit. These tests are asymptotically distribution-free, universally consistent, and computationally efficient…
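As an illustration of the graph-based tests mentioned in the abstract, the Friedman-Rafsky two-sample statistic can be sketched in a few lines: pool the two samples, build the Euclidean minimal spanning tree on the pooled points, and count the edges joining points from different samples (a small count is evidence that the samples come from different distributions). This is a minimal sketch, not code from the talk; the function and variable names are illustrative.

```python
# Minimal sketch of the Friedman-Rafsky two-sample statistic (illustrative only).
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def friedman_rafsky_statistic(X, Y):
    """Number of MST edges joining a point of X to a point of Y."""
    Z = np.vstack([X, Y])                      # pooled sample
    labels = np.r_[np.zeros(len(X)), np.ones(len(Y))]
    D = distance_matrix(Z, Z)                  # pairwise Euclidean distances
    mst = minimum_spanning_tree(D).tocoo()     # MST of the pooled sample
    # count edges whose endpoints carry different sample labels
    return int(np.sum(labels[mst.row] != labels[mst.col]))

# Example: two Gaussian samples in dimension 8 (the dimension in the title)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
Y = rng.normal(loc=0.5, size=(100, 8))
print(friedman_rafsky_statistic(X, Y))
```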
WiDS Cambridge 2021
For the fifth year in a row, Harvard, MIT, Microsoft Research New England, and the Broad Institute are proud to collaborate with Stanford University to bring the Women in Data Science (WiDS) conference to Cambridge, Massachusetts. This virtual, one-day technical conference will feature an all-female lineup of speakers from academia and industry, offering a chance to hear about the latest data science-related research in a number of domains, learn how leading-edge companies are leveraging data science for success, and connect with potential mentors, collaborators,…
On nearly assumption-free tests of nominal confidence interval coverage for causal parameters estimated by machine learning
James Robins, Harvard
Abstract: For many causal effect parameters of interest, doubly robust machine learning (DRML) estimators ψ̂₁ are the state-of-the-art, incorporating the good prediction performance of machine learning; the decreased bias of doubly robust estimators; and the analytic tractability and bias reduction of sample splitting with cross-fitting. Nonetheless, even in the absence of confounding by unmeasured factors, the nominal (1−α) Wald confidence interval ψ̂₁ ± zα/2 ŝe may still undercover even in large samples, because the bias of ψ̂₁ may be of the same…
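For concreteness, the Wald interval referenced in the abstract, ψ̂₁ ± zα/2 ŝe, is computed below; this is a generic numerical illustration rather than the DRML construction from the talk, and the input numbers are made up.

```python
# Generic Wald confidence interval psi_hat ± z_{alpha/2} * se_hat
# (illustrative numbers only; this is not the DRML estimator from the talk).
from scipy.stats import norm

def wald_ci(psi_hat, se_hat, alpha=0.05):
    z = norm.ppf(1 - alpha / 2)                 # z_{alpha/2}
    return psi_hat - z * se_hat, psi_hat + z * se_hat

# If the estimator's bias is of the same order as se_hat, an interval centered
# at psi_hat undercovers the true parameter even in large samples.
print(wald_ci(psi_hat=1.3, se_hat=0.2))         # approx (0.908, 1.692)
```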
Relaxing the I.I.D. Assumption: Adaptively Minimax Optimal Regret via Root-Entropic Regularization
Daniel Roy, University of Toronto
Abstract: We consider sequential prediction with expert advice when data are generated from distributions varying arbitrarily within an unknown constraint set. We quantify relaxations of the classical i.i.d. assumption in terms of these constraint sets, with i.i.d. sequences at one extreme and adversarial mechanisms at the other. The Hedge algorithm, long known to be minimax optimal in the adversarial regime, was recently shown to be minimax optimal for i.i.d. data. We show that Hedge with deterministic learning rates is suboptimal…
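The Hedge algorithm discussed above is the classical exponential-weights strategy for prediction with expert advice; below is a minimal sketch with a fixed deterministic learning rate η, the kind of tuning whose suboptimality the abstract refers to. This is illustrative code, not the paper's.

```python
# Minimal sketch of the Hedge (exponential-weights) algorithm with a fixed
# deterministic learning rate eta (illustrative only).
import numpy as np

def hedge(loss_matrix, eta):
    """loss_matrix: T x K array of expert losses in [0, 1]; returns total learner loss."""
    T, K = loss_matrix.shape
    log_w = np.zeros(K)                       # log-weights of the K experts
    total_loss = 0.0
    for t in range(T):
        p = np.exp(log_w - log_w.max())
        p /= p.sum()                          # play the exponential-weights distribution
        total_loss += p @ loss_matrix[t]
        log_w -= eta * loss_matrix[t]         # multiplicative-weights update
    return total_loss

# Example: 2 experts with adversarial-looking alternating losses over 100 rounds
losses = np.tile([[0.0, 1.0], [1.0, 0.0]], (50, 1))
print(hedge(losses, eta=np.sqrt(8 * np.log(2) / 100)))  # eta = sqrt(8 ln K / T)
```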
Testing the I.I.D. assumption online
Vladimir Vovk, Royal Holloway, University of London
Abstract: Mainstream machine learning, despite its recent successes, has a serious drawback: while its state-of-the-art algorithms often produce excellent predictions, they do not provide measures of their accuracy and reliability that would be both practically useful and provably valid. Conformal prediction adapts rank tests, popular in nonparametric statistics, to testing the IID assumption (the observations being independent and identically distributed). This gives us practical measures, provably valid under the IID assumption, of the accuracy and reliability of predictions produced by…
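As background for the abstract's reference to conformal prediction, here is a sketch of the standard split-conformal recipe, which produces prediction intervals that are valid whenever the observations are IID (exchangeable). It is not the online testing procedure from the talk, and the regression model and data are placeholders.

```python
# Sketch of split conformal prediction: calibrate absolute residuals on held-out
# data to get prediction intervals valid under the IID assumption (illustrative).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=400)

# split into a proper training set and a calibration set
X_tr, y_tr, X_cal, y_cal = X[:200], y[:200], X[200:], y[200:]
model = LinearRegression().fit(X_tr, y_tr)

alpha = 0.1
scores = np.abs(y_cal - model.predict(X_cal))        # nonconformity scores
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))    # conformal quantile index
q = np.sort(scores)[k - 1]

x_new = rng.normal(size=(1, 3))
pred = model.predict(x_new)[0]
print(pred - q, pred + q)    # interval with >= 1 - alpha coverage under IID
```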
April 2021
Sampler for the Wasserstein barycenter
Thibaut Le Gouic, MIT
Abstract: Wasserstein barycenters have become a central object in applied optimal transport as a tool to summarize complex objects that can be represented as distributions. Such objects include posterior distributions in Bayesian statistics, functions in functional data analysis, and images in graphics. In a nutshell, a Wasserstein barycenter is a probability distribution that provides a compelling summary of a finite set of input distributions. While the question of computing Wasserstein barycenters has received significant attention, this talk focuses on a…
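To make the "compelling summary" concrete in the simplest setting: in one dimension, the 2-Wasserstein barycenter of a set of distributions is obtained by averaging their quantile functions. The sketch below illustrates this for three Gaussian samples; it is not the sampler from the talk.

```python
# Illustrative 1-D 2-Wasserstein barycenter: average the quantile functions of
# the input samples on a fixed grid of probability levels (not the talk's sampler).
import numpy as np

def barycenter_1d(samples, weights=None, grid=np.linspace(0.01, 0.99, 99)):
    """samples: list of 1-D arrays; returns quantiles of the barycenter on `grid`."""
    weights = np.ones(len(samples)) / len(samples) if weights is None else weights
    quantiles = np.array([np.quantile(s, grid) for s in samples])
    return weights @ quantiles                # weighted average of quantile functions

rng = np.random.default_rng(2)
inputs = [rng.normal(loc=m, size=1000) for m in (-2.0, 0.0, 3.0)]
print(barycenter_1d(inputs)[:5])              # first few barycenter quantiles
```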
How Can Governments Facilitate the Integration of Newcomers? Building an Evidence and Innovation Agenda for Migration Research
Jens Hainmueller (Stanford University)
Please join us on Tuesday, April 6, 2021 at 3:00pm for the Distinguished Speaker Seminar with Jens Hainmueller (Stanford University).
Function space view of linear multi-channel convolution networks with bounded weight norm
Suriya Gunasekar, Microsoft Research
Abstract: The magnitude of the weights of a neural network is a fundamental measure of complexity that plays a crucial role in the study of implicit and explicit regularization. For example, in recent work, gradient descent updates in overparameterized models asymptotically lead to solutions that implicitly minimize the ℓ₂ norm of the parameters of the model, resulting in an inductive bias that is highly architecture dependent. To investigate the properties of learned functions, it is natural to consider a function…
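The implicit ℓ₂-norm minimization mentioned in the abstract can be seen in the simplest overparameterized setting: gradient descent started from zero on an underdetermined least-squares problem converges to the minimum-norm interpolating solution. The sketch below is a toy illustration of that phenomenon, not the speaker's analysis of convolutional networks.

```python
# Toy illustration: gradient descent from zero on overparameterized least squares
# converges to the minimum L2-norm interpolating solution (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(20, 100))                  # 20 samples, 100 parameters
y = rng.normal(size=20)

w = np.zeros(100)
for _ in range(5000):
    w -= 1e-2 * X.T @ (X @ w - y) / len(y)      # plain gradient descent on squared loss

w_min_norm = X.T @ np.linalg.solve(X @ X.T, y)  # minimum-norm interpolant (pseudoinverse)
print(np.linalg.norm(w - w_min_norm))           # ~0: GD found the min-norm solution
```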
AI for Healthcare Equity Conference
The potential of AI to bring equity to healthcare has spurred significant research efforts across academia, industry, and government. Racial, gender, and socio-economic disparities have traditionally afflicted healthcare systems in ways that are difficult to detect and quantify. New AI technologies, however, provide a platform for change.
Sample Size Considerations in Precision Medicine
Eric Laber, Duke University
Abstract: Sequential Multiple Assignment Randomized Trials (SMARTs) are considered the gold standard for estimation and evaluation of treatment regimes. SMARTs are typically sized to ensure sufficient power for a simple comparison, e.g., the comparison of two fixed treatment sequences. Estimation of an optimal treatment regime is conducted as part of a secondary and hypothesis-generating analysis with formal evaluation of the estimated optimal regime deferred to a follow-up trial. However, running a follow-up trial to evaluate an estimated optimal treatment regime…