
Sharp Thresholds for Random Subspaces, and Applications

Mary Wootters - Stanford University
online

Abstract: What combinatorial properties are likely to be satisfied by a random subspace over a finite field? For example, is it likely that not too many points lie in any Hamming ball? What about any cube?  We show that there is a sharp threshold on the dimension of the subspace at which the answers to these questions change from "extremely likely" to "extremely unlikely," and moreover we give a simple characterization of this threshold for different properties. Our motivation comes…
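
As a rough illustration of the kind of question being asked (a toy experiment of our own, not taken from the talk), the Python sketch below samples a random $k$-dimensional subspace of $\mathbb{F}_2^n$ for growing $k$ and counts how many of its points fall in a fixed Hamming ball around the origin; all parameter choices are illustrative.

```python
# Toy experiment (not from the talk): sample a random k-dimensional subspace
# of F_2^n and count how many of its points lie in a Hamming ball of radius r
# around the origin, as the dimension k grows.
import itertools
import numpy as np

def random_subspace_points(n, k, rng):
    """Return the distinct points of a random k-dim subspace of F_2^n."""
    G = rng.integers(0, 2, size=(k, n))            # random generator matrix
    msgs = np.array(list(itertools.product([0, 1], repeat=k)))
    return np.unique((msgs @ G) % 2, axis=0)       # dedupe if G is rank-deficient

rng = np.random.default_rng(0)
n, r = 14, 3
for k in range(1, 11):
    pts = random_subspace_points(n, k, rng)
    in_ball = int((pts.sum(axis=1) <= r).sum())    # Hamming weight <= r
    print(f"k={k:2d}  subspace points in the radius-{r} ball: {in_ball}")
```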


Perfect Simulation for Feynman-Kac Models using Ensemble Rejection Sampling

Arnaud Doucet - University of Oxford
online

Abstract: I will introduce Ensemble Rejection Sampling, a scheme for perfect simulation of a class of Feynman-Kac models. In particular, this scheme allows us to sample exactly from the posterior distribution of the latent states of a class of non-linear non-Gaussian state-space models and from the distribution of a class of conditioned random walks. Ensemble Rejection Sampling relies on a high-dimensional proposal distribution built using ensembles of state samples and dynamic programming. Although this algorithm can be interpreted as a…
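
As background, here is a minimal Python sketch of classical rejection sampling, the primitive that Ensemble Rejection Sampling builds on; this illustrates the principle only and is not the ensemble scheme or any code from the talk. The target (a half-normal) and the Exp(1) proposal are our own choices.

```python
# Minimal rejection sampler: exact draws from a half-normal target
# f(x) ∝ exp(-x^2/2) on [0, inf) using an Exp(1) proposal g(x) = exp(-x).
# f(x)/g(x) ∝ exp(x - x^2/2) is maximized at x = 1, so M = exp(1/2) works.
import math
import random

def rejection_sample_half_normal():
    while True:
        x = random.expovariate(1.0)                    # draw from the proposal
        accept_prob = math.exp(x - x * x / 2 - 0.5)    # f(x) / (M * g(x))
        if random.random() < accept_prob:
            return x                                   # an exact draw from f

samples = [rejection_sample_half_normal() for _ in range(10_000)]
print(sum(samples) / len(samples))   # should be near sqrt(2/pi) ≈ 0.798
```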


A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Net

Rong Ge - Duke University
online

Abstract: The training of neural networks optimizes complex non-convex objective functions, yet in practice simple algorithms achieve great performance. Recent works suggest that over-parameterization could be a key ingredient in explaining this discrepancy. However, current theories cannot fully explain the role of over-parameterization. In particular, they either work in a regime where neurons don't move much, or require a large number of neurons. In this paper we develop a local convergence theory for mildly over-parameterized two-layer neural networks. We show…
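
To make the setting concrete, here is a toy Python sketch, under our own illustrative choices of data, width, and learning rate, and not the paper's construction: gradient descent on the first-layer weights of a mildly over-parameterized two-layer ReLU network on a small regression task.

```python
# Toy two-layer ReLU network trained by gradient descent on the first layer
# (output weights held fixed), to watch the training loss decrease.
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 100, 5, 50                    # n samples, input dim d, m hidden neurons
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0])                     # a simple target depending on one coordinate

W = rng.normal(size=(m, d)) / np.sqrt(d)           # first-layer weights
a = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)   # fixed output weights

lr = 0.2
for step in range(501):
    H = np.maximum(X @ W.T, 0.0)        # ReLU activations, shape (n, m)
    pred = H @ a
    resid = pred - y
    loss = 0.5 * np.mean(resid ** 2)
    # gradient wrt W: only active neurons contribute (ReLU derivative)
    grad_W = ((resid[:, None] * (H > 0.0)) * a).T @ X / n
    W -= lr * grad_W
    if step % 100 == 0:
        print(f"step {step:4d}  loss {loss:.4f}")
```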


MIT Sports Summit 2021

online

The MIT Sports Lab invites you to the MIT Sports Summit 2021, a virtual event hosted on Thursday, Feb. 4th and Friday, Feb. 5th! It is an opportunity for the MIT community to interface with the Sports Lab’s affiliates and partners, sharing advances, challenges, and passions at the intersection of engineering and sports. We are featuring talks from leaders in industry and academia, as well as interactive sessions showcasing student research posters and sports tech startups. This is an invitation-only event for current MIT community…


Faster and Simpler Algorithms for List Learning

Jerry Li - Microsoft Research
online

Abstract: The goal of list learning is to understand how to learn basic statistics of a dataset when it has been corrupted by an overwhelming fraction of outliers. More formally, one is given a set of points $S$, of which an $\alpha$-fraction $T$ are promised to be well-behaved. The goal is then to output an $O(1/\alpha)$-sized list of candidate means, so that one of these candidates is close to the true mean of the points in $T$.…
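
To illustrate the problem setup (with a deliberately naive baseline, not the talk's faster algorithm), the Python sketch below plants an $\alpha$-fraction of inliers among outliers, produces an $O(1/\alpha)$-sized candidate list using plain Lloyd's k-means, and measures how close the best candidate comes to the true mean; all parameters are our own choices.

```python
# Naive list-learning baseline: cluster the corrupted dataset with O(1/alpha)
# centers and hope one center lands near the true mean of the inliers.
import numpy as np

rng = np.random.default_rng(2)
alpha, n, d = 0.1, 2000, 2
n_good = int(alpha * n)
true_mean = np.array([5.0, -3.0])
good = rng.normal(size=(n_good, d)) + true_mean     # the well-behaved set T
bad = rng.uniform(-20, 20, size=(n - n_good, d))    # outliers
S = np.vstack([good, bad])

k = int(2 / alpha)                                  # O(1/alpha) candidates
centers = S[rng.choice(len(S), size=k, replace=False)]
for _ in range(20):                                 # a few Lloyd iterations
    labels = np.argmin(((S[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    for j in range(k):
        if (labels == j).any():
            centers[j] = S[labels == j].mean(axis=0)

dists = np.linalg.norm(centers - true_mean, axis=1)
print(f"best candidate error: {dists.min():.3f}  (list size {k})")
```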


Self-regularizing Property of Nonparametric Maximum Likelihood Estimator in Mixture Models

Yury Polyanskiy - MIT
online

Abstract: Introduced by Kiefer and Wolfowitz (1956), the nonparametric maximum likelihood estimator (NPMLE) is a widely used methodology for learning mixture models and empirical Bayes estimation. Sidestepping the non-convexity in the mixture likelihood, the NPMLE estimates the mixing distribution by maximizing the total likelihood over the space of probability measures, which can be viewed as an extreme form of over-parameterization. In this work we discover a surprising property of the NPMLE solution. Consider, for example, a Gaussian mixture model on…
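
As a hedged sketch of how such an NPMLE can be computed in practice (an assumed grid-based setup of our own, not code from the talk): fixing the candidate atoms to a grid makes the log-likelihood concave in the mixing weights, and the classical EM fixed-point update then converges to the grid NPMLE. The sparse support that emerges illustrates the self-regularizing behavior the abstract alludes to.

```python
# Grid NPMLE for a 1-D Gaussian location mixture (unit variance): restrict the
# mixing distribution to atoms on a fixed grid and run the EM fixed point,
# which converges because the likelihood is concave in the weights.
import numpy as np

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 150)])  # data

grid = np.linspace(-6, 6, 121)                 # candidate atom locations
# likelihood of each point under each atom
L = np.exp(-0.5 * (x[:, None] - grid[None]) ** 2) / np.sqrt(2 * np.pi)

w = np.full(len(grid), 1.0 / len(grid))        # uniform initial weights
for _ in range(500):
    post = L * w                               # (n_points, n_atoms)
    post /= post.sum(axis=1, keepdims=True)    # responsibilities
    w = post.mean(axis=0)                      # EM weight update

support = grid[w > 1e-3]
print("atoms with non-negligible weight:", np.round(support, 2))
# The solution typically puts weight on only a handful of the 121 grid atoms.
```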


