Towards Robust Statistical Learning Theory
October 18 @ 11:00 am - 12:00 pm
Stanislav Minsker (USC)
Real-world data typically do not fit statistical models or satisfy the assumptions underlying the theory exactly; reducing the number and strictness of these assumptions therefore helps to narrow the gap between the “mathematical” world and the “real” world. The concept of robustness, in particular robustness to outliers, plays a central role in understanding this gap. The goal of the talk is to introduce these principles, along with robust algorithms based on them, that can be applied in the general framework of statistical learning theory. These algorithms avoid explicit (and often bias-producing) outlier detection and removal, instead taking advantage of induced symmetries in the distribution of the data.
I will discuss uniform deviation bounds for the mean estimators of heavy-tailed distributions and applications of these bounds to robust empirical risk minimization.
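A classic example of a robust mean estimator of the kind discussed here is the median-of-means estimator, which is standard in the heavy-tailed estimation literature. The sketch below is purely illustrative and not necessarily the exact estimator of the talk; the block count `k` is a tuning parameter chosen by the user.

```python
import numpy as np

def median_of_means(x, k, seed=0):
    """Median-of-means estimate of the mean of a 1-D sample.

    Randomly partition the sample into k blocks, average within
    each block, and return the median of the block means. Unlike
    the empirical mean, this is resistant to heavy tails and a
    small number of outliers.
    """
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))       # random partition into blocks
    blocks = np.array_split(x[idx], k)
    block_means = [b.mean() for b in blocks]
    return float(np.median(block_means))

# One extreme outlier barely moves the estimate, while the
# empirical mean is pulled far away from the bulk of the data.
sample = np.concatenate([np.ones(99), [1e6]])
robust = median_of_means(sample, k=11)
naive = sample.mean()
```

With 11 blocks, at most one block can contain the outlier, so the median of the block means stays near 1 even though the empirical mean is roughly 10,000.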
Implications of the proposed techniques for logistic regression and regression with quadratic loss will be highlighted.
This talk is partially based on joint work with Timothée Mathieu.
Stanislav Minsker is currently an Assistant Professor in the Department of Mathematics at the University of Southern California. He received his B.Sc. in Mathematics from Novosibirsk State University in 2007 and his Ph.D. in Mathematics from the Georgia Institute of Technology in 2012. Prior to joining USC, he was a Visiting Assistant Professor at Duke University and worked in Quantitative Analytics at Wells Fargo Securities. His main research interests are in the areas of statistical learning theory, robust statistics, and concentration of measure inequalities.