Generative Models and Compressed Sensing
November 17, 2017 @ 11:00 am - 12:00 pm
Alex Dimakis (University of Texas at Austin)
Abstract: The goal of compressed sensing is to estimate a vector from an under-determined system of noisy linear measurements, by making use of prior knowledge in the relevant domain. For most results in the literature, the structure is represented by sparsity in a well-chosen basis. We show how to achieve guarantees similar to standard compressed sensing but without employing sparsity at all. Instead, we assume that the unknown vectors lie near the range of a generative model, e.g. a GAN or a VAE. We show how the problems of image inpainting and super-resolution are special cases of our general framework.
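The recovery recipe described above can be sketched in a few lines. This is a hedged toy illustration, not the talk's implementation: the "generator" `G` below is a linear decoder with a random weight matrix `W` standing in for a trained GAN/VAE decoder, and all dimensions (`k`, `n`, `m`) are made-up toy values. For a nonlinear generator the same objective is minimized over the latent variable `z` by gradient descent with backpropagation and random restarts.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, m = 5, 200, 30  # latent dim << number of measurements << ambient dim

# Toy linear "decoder" standing in for a trained generative model G.
W = rng.normal(size=(n, k))
G = lambda z: W @ z

z_star = rng.normal(size=k)
x_star = G(z_star)                        # unknown signal, in the range of G
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian measurement matrix
y = A @ x_star                            # m << n noiseless linear measurements

# Recover by gradient descent on f(z) = ||A G(z) - y||^2 in latent space.
M = A @ W                                  # composed forward map
z = np.zeros(k)
step = 1.0 / np.linalg.norm(M, 2) ** 2     # 1 / largest eigenvalue of M^T M
for _ in range(500):
    z -= step * M.T @ (M @ z - y)

rel_err = np.linalg.norm(G(z) - x_star) / np.linalg.norm(x_star)
print(rel_err)  # essentially zero: G(z) recovers x_star
```

Note that only m = 30 measurements suffice to recover a 200-dimensional signal, because the signal is constrained to the (here 5-dimensional) range of the generator rather than to a sparse support.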
We show how to generalize the RIP condition to generative models, and prove that random Gaussian measurement matrices satisfy this property with high probability. A Lipschitz condition on the generative neural network is the key technical ingredient in our results.
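The generalized RIP property can be checked numerically on a toy example. The sketch below is an assumption-laden illustration of the idea, not the talk's definition: it uses a made-up one-layer `tanh` generator (chosen because `tanh` is 1-Lipschitz, so the network is Lipschitz with constant at most the spectral norm of its weights) and verifies that a random Gaussian matrix approximately preserves the norms of differences of points in the generator's range.

```python
import numpy as np

rng = np.random.default_rng(1)
k, n, m = 5, 1000, 200
W = rng.normal(size=(n, k)) / np.sqrt(k)
G = lambda z: np.tanh(W @ z)  # Lipschitz generator: tanh is 1-Lipschitz
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian measurements

# RIP-style check over the range of G: for pairs of points in the range,
# ||A(G(z1) - G(z2))|| should concentrate around ||G(z1) - G(z2)||.
ratios = []
for _ in range(200):
    z1, z2 = rng.normal(size=k), rng.normal(size=k)
    d = G(z1) - G(z2)
    ratios.append(np.linalg.norm(A @ d) / np.linalg.norm(d))

print(min(ratios), max(ratios))  # both concentrate near 1
```

Every sampled ratio lands close to 1, i.e. the random Gaussian matrix acts as a near-isometry on differences of points in the generator's range, which is the property that makes latent-space recovery well-posed.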
Time permitting, we will discuss follow-up work on how GANs can model causal structure in high-dimensional probability distributions. (Based on joint work with Ashish Bora, Ajil Jalal, Murat Kocaoglu, Christopher Snyder and Eric Price.)
Biography: Alex Dimakis is an Associate Professor in the ECE Department at the University of Texas at Austin. He received his Ph.D. in 2008 from UC Berkeley, working with Martin Wainwright and Kannan Ramchandran. He has received an NSF CAREER award, a Google Faculty Research Award, and the Eli Jury dissertation award. He is the co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. He is currently serving as an associate editor for IEEE Transactions on Information Theory. His research interests include information theory, coding theory, and machine learning.