Social Networks and the Market for News
Please join us on October 5, 2020 at 4pm for the Distinguished Speaker Seminar with Rachel Kranton, James B. Duke Distinguished Professor of Economics at Duke University.
Abstract: One of the most basic problems in statistics is the estimation of the mean of a random vector, based on independent observations. This problem has received renewed attention in the last few years, both from statistical and computational points of view. In this talk we review some recent results on the statistical performance of mean estimators that allow heavy tails and adversarial contamination in the data. The basic punchline is that one can construct estimators that, under minimal conditions,…
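A standard building block in this literature on heavy-tailed mean estimation is the median-of-means estimator. The sketch below is illustrative only (the function name and block count `k` are our choices, not from the talk): split the sample into disjoint blocks, average each block, and take the median of the block means, so a few gross outliers cannot move the estimate.

```python
import numpy as np

def median_of_means(x, k=10):
    """Median-of-means estimate of a scalar mean.

    Partition the n observations into k random disjoint blocks,
    average each block, and return the median of the block means.
    Illustrative sketch; k trades off robustness against accuracy.
    """
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(0)
    x = rng.permutation(x)               # randomize block membership
    blocks = np.array_split(x, k)
    return float(np.median([b.mean() for b in blocks]))
```

Because an outlier can corrupt at most the blocks it lands in, the median over block means stays near the true mean even when the empirical mean is pulled far away.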
Abstract: In this talk we discuss the idea of data-driven regularisers for inverse imaging problems. We are particularly interested in the combination of mathematical models and purely data-driven approaches, getting the best of both worlds. In this context we will make a journey from “shallow” learning for computing optimal parameters for variational regularisation models by bilevel optimization to the investigation of different approaches that use deep neural networks for solving inverse imaging problems. Bio: Carola-Bibiane Schönlieb is Professor of…
Abstract: Wasserstein-based distributional robust optimization problems are formulated as min-max games in which a statistician chooses a parameter to minimize an expected loss against an adversary (say, nature) that wishes to maximize the loss by choosing an appropriate probability model within a certain non-parametric class. Recently, these formulations have been studied in the context in which the non-parametric class chosen by nature is defined as a Wasserstein-distance neighborhood around the empirical measure. It turns out that by appropriately choosing the…
Please join us on November 2, 2020 at 4pm for the Distinguished Speaker Seminar with Eric J. Tchetgen Tchetgen, Luddy Family President’s Distinguished Professor and Professor of Statistics at the University of Pennsylvania.
Abstract: As datasets continue to grow in size, in many settings the focus of data collection has shifted away from testing pre-specified hypotheses and towards hypothesis generation. Researchers are often interested in performing an exploratory data analysis in order to generate hypotheses, and then testing those hypotheses on the same data; I will refer to this as 'double dipping'. Unfortunately, double dipping can lead to highly inflated Type I error rates. In this talk, I will consider the special case of hierarchical…
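The Type I error inflation from double dipping is easy to reproduce in simulation. The sketch below is a generic illustration, not the talk's hierarchical setting: a median split of pure-noise data stands in for a data-driven grouping, and a t-test between the resulting groups rejects essentially always even though no real groups exist.

```python
import numpy as np
from scipy import stats

# Double dipping: use the SAME data both to define groups (a median
# split) and to test for a difference between them. Under the null
# (all observations i.i.d. N(0,1)) a valid level-0.05 test should
# reject about 5% of the time; this procedure rejects almost always.
rng = np.random.default_rng(0)
n_sim = 200
rejections = 0
for _ in range(n_sim):
    x = rng.standard_normal(100)                    # pure noise, no groups
    lo, hi = x[x <= np.median(x)], x[x > np.median(x)]
    _, p = stats.ttest_ind(lo, hi)
    rejections += (p < 0.05)
print(rejections / n_sim)                           # far above the nominal 0.05
```

The fix, in general, is to account for the selection step, e.g. by data splitting or by a selective inference procedure that conditions on how the groups were chosen.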
Abstract: What combinatorial properties are likely to be satisfied by a random subspace over a finite field? For example, is it likely that not too many points lie in any Hamming ball? What about any cube? We show that there is a sharp threshold on the dimension of the subspace at which the answers to these questions change from "extremely likely" to "extremely unlikely," and moreover we give a simple characterization of this threshold for different properties. Our motivation comes…
Abstract: I will introduce Ensemble Rejection Sampling, a scheme for perfect simulation of a class of Feynman-Kac models. In particular, this scheme allows us to sample exactly from the posterior distribution of the latent states of a class of non-linear non-Gaussian state-space models and from the distribution of a class of conditioned random walks. Ensemble Rejection Sampling relies on a high-dimensional proposal distribution built using ensembles of state samples and dynamic programming. Although this algorithm can be interpreted as a…
Abstract: The training of neural networks optimizes complex non-convex objective functions, yet in practice simple algorithms achieve strong performance. Recent works suggest that over-parameterization could be a key ingredient in explaining this discrepancy. However, current theories cannot fully explain the role of over-parameterization. In particular, they either work in a regime where neurons don't move much, or require a large number of neurons. In this paper we develop a local convergence theory for mildly over-parameterized two-layer neural networks. We show…
IDSS will host Prof. Bruce Western as part of the Distinguished Speaker Seminar series. Prof. Western's research has examined the causes, scope, and consequences of the historic growth in U.S. prison populations. He is Co-Director of the Justice Lab at Columbia University.