Learning and estimation: separated at birth, reunited at last
April 11, 2014 @ 11:00 am
Alexander Rakhlin (University of Pennsylvania, The Wharton School)
Abstract: We consider the problem of regression in three scenarios: (a) random design under the assumption that the model F is correctly specified; (b) distribution-free statistical learning with respect to a reference class F; and (c) online regression with no assumption on the generative process. The first problem is often studied in the literature on nonparametric estimation, the second falls within the purview of statistical learning theory, and the third is studied within the online learning community. It is recognized that the complexity of the class F plays the key role in determining minimax behavior: the importance of entropy in the study of estimation goes back to Le Cam, Ibragimov and Khas'minskii, and Birgé; within statistical learning, the importance of entropy was established in the work of Vapnik and Chervonenkis and in subsequent work on uniform laws of large numbers within empirical process theory. The corresponding complexities for online learning have been identified only in the past few years. But do these three problems really differ from the minimax point of view? This question, which boils down to understanding well-specified and misspecified models, will be addressed in this talk.
Joint work with Karthik Sridharan and Sasha Tsybakov.
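As a rough sketch of what is being compared (the notation below is assumed for illustration, not taken from the talk), the three scenarios each define a minimax value for square-loss regression with respect to a class F:

```latex
% (a) well-specified estimation: the regression function f^* lies in F
\mathcal{V}_{\mathrm{est}}(F) \;=\; \inf_{\hat f}\, \sup_{f^* \in F}\, \mathbb{E}\,\|\hat f - f^*\|^2

% (b) distribution-free learning: excess risk relative to the best f in F,
%     with no assumption that the model is correctly specified
\mathcal{V}_{\mathrm{learn}}(F) \;=\; \inf_{\hat f}\, \sup_{P}\,
  \Big( \mathbb{E}\,(\hat f(X)-Y)^2 \;-\; \inf_{f \in F}\, \mathbb{E}\,(f(X)-Y)^2 \Big)

% (c) online regression: regret against F on an arbitrary (non-generative) sequence
\mathcal{V}_{\mathrm{online}}(F) \;=\; \inf_{\mathrm{alg}}\, \sup_{(x_t, y_t)_{t=1}^n}\,
  \Big( \sum_{t=1}^n (\hat y_t - y_t)^2 \;-\; \inf_{f \in F}\, \sum_{t=1}^n (f(x_t)-y_t)^2 \Big)
```

The question posed in the abstract is whether these three values behave the same as a function of the complexity (e.g. entropy) of F.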