Scaling Limits of Neural Networks
November 8 @ 11:00 am - 12:00 pm
Boris Hanin, Princeton University
E18-304
Abstract: Neural networks are often studied analytically through scaling limits: regimes in which structural network parameters such as depth, width, and number of training datapoints are taken to infinity, yielding simplified models of learning. I will survey several such approaches with the goal of illustrating the rich and still not fully understood space of possible behaviors when some or all of the network's structural parameters are large.
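One classical example of such a simplification (an illustrative sketch, not material from the talk) is the infinite-width limit: the output of a randomly initialized one-hidden-layer network at a fixed input becomes Gaussian as the width grows, by a central-limit argument, provided the output weights are scaled by 1/sqrt(width). A minimal NumPy experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(width, x, n_samples=4000):
    """Sample outputs of randomly initialized one-hidden-layer nets at input x:
    f(x) = (1/sqrt(width)) * sum_j v_j * tanh(w_j . x),
    with all weights drawn i.i.d. standard normal."""
    w = rng.normal(size=(n_samples, width, x.shape[0]))  # input weights
    v = rng.normal(size=(n_samples, width))              # output weights
    h = np.tanh(w @ x)                                   # hidden activations, shape (n_samples, width)
    # The 1/sqrt(width) scaling keeps the output variance O(1) as width grows.
    return (v * h).sum(axis=1) / np.sqrt(width)

x = np.ones(3) / np.sqrt(3.0)  # fixed unit-norm input
for width in (1, 10, 500):
    out = random_net_output(width, x)
    # Variance stabilizes across widths; excess kurtosis shrinks toward 0
    # (the Gaussian value) as width increases.
    excess_kurtosis = ((out - out.mean()) ** 4).mean() / out.var() ** 2 - 3
    print(f"width={width:4d}  var={out.var():.3f}  excess_kurtosis={excess_kurtosis:+.3f}")
```

At width 1 the output distribution is visibly heavy-tailed (large positive excess kurtosis); by width 500 its first four moments are close to those of a Gaussian, which is the kind of analytic simplification the abstract refers to.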
Bio:
Boris Hanin is an Assistant Professor in the Department of Operations Research and Financial Engineering at Princeton University, working on deep learning, probability, and spectral asymptotics. Prior to Princeton, he was an Assistant Professor in Mathematics at Texas A&M and an NSF Postdoc at MIT Math. He is also an advisor and member of the technical staff at Foundry, an AI and computing startup.