BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//MIT Statistics and Data Science Center - ECPv5.14.2.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:MIT Statistics and Data Science Center
X-ORIGINAL-URL:https://stat.mit.edu
X-WR-CALDESC:Events for MIT Statistics and Data Science Center
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20190310T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20191103T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20190906T110000
DTEND;TZID=America/New_York:20190906T120000
DTSTAMP:20220528T132921Z
CREATED:20190802T174455Z
LAST-MODIFIED:20190903T133425Z
UID:3376-1567767600-1567771200@stat.mit.edu
SUMMARY:GANs\, Optimal Transport\, and Implicit Density Estimation
DESCRIPTION:Abstract: \nWe first study the rate of convergence for learning distributions with the adversarial framework and Generative Adversarial Networks (GANs)\, which subsumes Wasserstein\, Sobolev\, and MMD GANs as special cases. We study a wide range of parametric and nonparametric target distributions\, under a collection of objective evaluation metrics. On the nonparametric end\, we investigate the minimax optimal rates and fundamental difficulty of implicit density estimation under the adversarial framework. On the parametric end\, we establish a theory for general neural network classes that characterizes the interplay between the choices of generator and discriminator. We investigate how to obtain a good statistical guarantee for GANs through the lens of regularization. We discover and isolate a new notion of regularization\, called the generator/discriminator pair regularization\, that sheds light on the advantage of GANs compared to classical approaches for density estimation. We develop novel oracle inequalities as the main tools for analyzing GANs\, which are of independent theoretical interest. \nLater\, we proceed to discuss optimal transport\, estimation under the Wasserstein metric\, and how to use these tools for implicit density estimation. We will point out an interesting connection between pair regularization and optimal transport.\n\n\nBiography: \nDr. Liang is an assistant professor at Chicago Booth. He is also the George C. Tiao faculty fellow in data science research. His current research interests include computational and algorithmic aspects of statistical inference\, machine learning and statistical learning theory\, and stochastic methods in non-convex optimization. \nThe MIT Statistics and Data Science Center hosts guest lecturers from around the world in this weekly seminar.
URL:https://stat.mit.edu/calendar/liang/
LOCATION:E18-304\, United States
CATEGORIES:Stochastics and Statistics Seminar
GEO:42.3620185;-71.0878444
END:VEVENT
END:VCALENDAR