Inference in High Dimensions for (Mixed) Generalized Linear Models: the Linear, the Spectral and the Approximate
November 4, 2022 @ 11:00 am - 12:00 pm
Marco Mondelli, Institute of Science and Technology Austria
In a generalized linear model (GLM), the goal is to estimate a d-dimensional signal x from an n-dimensional observation of the form f(Ax, w), where A is a design matrix and w is a noise vector. Well-known examples of GLMs include linear regression, phase retrieval, 1-bit compressed sensing, and logistic regression. We focus on the high-dimensional setting in which both the number of measurements n and the signal dimension d diverge, with their ratio tending to a fixed constant. Linear and spectral methods are two popular approaches for obtaining an initial estimate, which is also commonly used as a ‘warm start’ for other algorithms. In particular, the linear estimator is a data-dependent linear combination of the columns of the design matrix, and its analysis is quite simple; the spectral estimator is the principal eigenvector of a data-dependent matrix, whose spectrum exhibits a phase transition.
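As a rough illustration of the two estimators, here is a minimal numerical sketch. The ReLU-type link f(g, w) = max(g, 0), the identity preprocessing T(y) = y, and the specific dimensions are illustrative assumptions for this sketch, not the speaker's exact constructions; the choice of preprocessing is precisely where the theory discussed in the talk comes in.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4000, 400  # n and d large with n/d fixed, per the high-dimensional regime
A = rng.standard_normal((n, d))          # Gaussian design matrix
x = rng.standard_normal(d)
x /= np.linalg.norm(x)                   # unit-norm ground-truth signal
y = np.maximum(A @ x, 0.0)               # observations y = f(Ax), ReLU link (assumption)

# Linear estimator: a data-dependent linear combination of the columns of A.
x_lin = A.T @ y
x_lin /= np.linalg.norm(x_lin)

# Spectral estimator: principal eigenvector of the data-dependent matrix
# D = (1/n) * sum_i T(y_i) a_i a_i^T, here with identity preprocessing T(y) = y.
T = y
D = (A * T[:, None]).T @ A / n
eigvals, eigvecs = np.linalg.eigh(D)     # eigh returns eigenvalues in ascending order
x_spec = eigvecs[:, -1]                  # top eigenvector

# Overlap with the true signal (up to sign) measures estimation quality.
overlap_lin = abs(x_lin @ x)
overlap_spec = abs(x_spec @ x)
print(f"linear overlap:   {overlap_lin:.3f}")
print(f"spectral overlap: {overlap_spec:.3f}")
```

With n/d well above the phase-transition threshold, both overlaps are bounded away from zero; shrinking n/d toward the threshold degrades the spectral overlap, which is the phase transition the abstract refers to.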
In this talk, I will start by discussing the emergence of this phase transition and provide precise asymptotics on the high-dimensional performance of the spectral method. Next, I will show how to optimally combine the linear and spectral estimators. Finally, I will add a ‘twist’ to the problem and consider the recovery of two signals from unlabeled data coming from a mixed GLM. Approximate message passing (AMP) algorithms (often used for high-dimensional inference tasks) will provide a powerful analytical tool to solve these problems.
Marco Mondelli received the B.S. and M.S. degrees in Telecommunications Engineering from the University of Pisa, Italy, in 2010 and 2012, respectively. In 2016, he obtained his Ph.D. degree in Computer and Communication Sciences at the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland. He is currently an Assistant Professor at the Institute of Science and Technology Austria (ISTA). Prior to that, he was a Postdoctoral Scholar in the Department of Electrical Engineering at Stanford University, USA, from February 2017 to August 2019. He was also a Research Fellow with the Simons Institute for the Theory of Computing, UC Berkeley, USA, for the program on Foundations of Data Science from August to December 2018. His research interests include data science, machine learning, information theory, and modern coding theory. He is the recipient of a number of fellowships and awards, including the Jack K. Wolf ISIT Student Paper Award in 2015, the STOC Best Paper Award in 2016, the EPFL Doctorate Award in 2018, the Simons-Berkeley Research Fellowship in 2018, the Lopez-Loreta Prize in 2019, and the Information Theory Society Best Paper Award in 2021.