Asymptotics of learning on dependent and structured random objects
November 5, 2021 @ 11:00 am - 12:00 pm
Morgane Austern (Harvard University)
E18-304
Abstract: Classical statistical inference relies on numerous tools from probability theory to study the properties of estimators. However, these same tools are often inadequate for modern machine learning problems, which frequently involve structured data (e.g., networks) or complicated dependence structures (e.g., dependent random matrices). In this talk, we extend universal limit theorems beyond the classical setting. Firstly, we consider distributionally "structured" and dependent random objects, i.e., random objects whose distribution is invariant under the action of an amenable group. We show, under mild moment and mixing conditions, a series of universal second- and third-order limit theorems: central limit theorems, concentration inequalities, the Wigner semicircle law, and Berry-Esseen bounds. The utility of these results will be illustrated by a series of examples in machine learning, network theory, and information theory. Secondly, building on these results, we establish the asymptotic distribution of the cross-validated risk when the number of folds is allowed to grow at an arbitrary rate. Using this, we study the statistical speed-up of cross-validation compared to a train-test split procedure, which reveals surprising results even for simple estimators.
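To make the comparison concrete, the sketch below (not from the talk; all helper names such as `train_test_risk` and `kfold_risk` are illustrative) uses a toy estimator, the sample mean under squared loss, and contrasts the variability of a single train-test split risk estimate with a K-fold cross-validated one via simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_mean(y_train):
    # Toy "estimator": predict with the training-sample mean.
    return y_train.mean()

def squared_loss(pred, y):
    return (y - pred) ** 2

def train_test_risk(y, test_frac=0.2):
    # Risk estimate from a single train-test split.
    n = len(y)
    idx = rng.permutation(n)
    n_test = int(test_frac * n)
    test, train = idx[:n_test], idx[n_test:]
    pred = fit_mean(y[train])
    return squared_loss(pred, y[test]).mean()

def kfold_risk(y, k=10):
    # K-fold cross-validated risk estimate: average held-out loss over folds.
    idx = rng.permutation(len(y))
    losses = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = fit_mean(y[train])
        losses.append(squared_loss(pred, y[fold]).mean())
    return float(np.mean(losses))

# Monte Carlo comparison of the two risk estimators' spread.
n, reps = 200, 2000
tt = [train_test_risk(rng.normal(size=n)) for _ in range(reps)]
cv = [kfold_risk(rng.normal(size=n), k=10) for _ in range(reps)]
print("train-test split: mean %.3f, sd %.3f" % (np.mean(tt), np.std(tt)))
print("10-fold CV:       mean %.3f, sd %.3f" % (np.mean(cv), np.std(cv)))
```

In this toy setting the cross-validated estimate typically shows a smaller standard deviation than the single split; the talk's results characterize the asymptotic distribution of such cross-validated risks in far greater generality.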
--------------------
Bio: Morgane Austern is an assistant professor in the Department of Statistics at Harvard University. Her research focuses on problems in probability and statistics that are motivated by machine learning. She graduated with a PhD in statistics from Columbia University in 2019, where she worked in collaboration with Peter Orbanz and Arian Maleki on limit theorems for dependent and structured data. Before joining Harvard, she was a postdoctoral researcher at Microsoft Research New England.