Size-Independent Sample Complexity of Neural Networks
May 4 @ 11:00 am - 12:00 pm
Ohad Shamir (Weizmann Institute)
Abstract: I’ll describe new bounds on the sample complexity of deep neural networks, based on the norms of the parameter matrices at each layer. In particular, we show how certain norms lead to the first explicit bounds which are fully independent of the network size (both depth and width), and are therefore applicable to arbitrarily large neural networks. These results are derived using some novel techniques, which may be of independent interest.
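The abstract does not state the bounds themselves, but norm-based capacity measures of this flavor are typically built from per-layer matrix norms rather than the architecture's depth or width. As a purely illustrative sketch (not the talk's actual bound, which includes additional factors), one such measure is the product of per-layer Frobenius norms scaled by the sample size:

```python
import numpy as np

def frobenius_capacity(weights, n_samples):
    """Illustrative norm-based capacity measure: the product of per-layer
    Frobenius norms, scaled by 1/sqrt(n_samples).

    This is a hypothetical simplification for intuition only; the bounds
    described in the talk involve further terms and constants.
    """
    prod = 1.0
    for W in weights:
        prod *= np.linalg.norm(W, "fro")
    return prod / np.sqrt(n_samples)

# Example: weight matrices of a hypothetical 3-layer network.
rng = np.random.default_rng(0)
weights = [
    rng.normal(size=(64, 32)),
    rng.normal(size=(32, 32)),
    rng.normal(size=(32, 1)),
]
print(frobenius_capacity(weights, n_samples=10_000))
```

Note that the measure never references the number of layers or their dimensions directly, only their norms; this is the sense in which a bound built from such quantities can be independent of network size.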
Joint work with Noah Golowich (Harvard) and Alexander Rakhlin (MIT).
Biography: Ohad Shamir is a faculty member in the Department of Computer Science and Applied Mathematics at the Weizmann Institute of Science, Israel. His research focuses on machine learning, with emphasis on algorithms which combine practical efficiency and theoretical insight. He is also interested in the many intersections of machine learning with related fields, such as optimization, statistics, theoretical computer science and AI. He has served as program co-chair of COLT 2017, and is currently a member of the COLT steering committee.