
Finite-Particle Convergence Rates for Stein Variational Gradient Descent
March 7 @ 11:00 am - 12:00 pm
Krishna Balasubramanian, University of California - Davis
E18-304
Abstract:
Stein Variational Gradient Descent (SVGD) is a deterministic, interacting particle-based algorithm for nonparametric variational inference, yet its theoretical properties remain challenging to fully understand. This talk presents two complementary perspectives on SVGD. First, we introduce Gaussian-SVGD, a framework that projects SVGD onto the family of Gaussian distributions using a bilinear kernel. We establish rigorous convergence results for both mean-field dynamics and finite-particle systems, proving linear convergence to equilibrium in strongly log-concave settings. This framework also unifies recent algorithms for Gaussian Variational Inference (GVI) under a single theoretical lens. Second, we examine the finite-particle convergence rates of nonparametric SVGD in Kernelized Stein Discrepancy (KSD) and Wasserstein-2 metrics. By decomposing the time derivative of relative entropy, we derive near-optimal convergence rates with polynomial dependence on dimensionality for certain kernel families. We also outline a framework to compare deterministic SVGD algorithms to the more standard randomized MCMC algorithms.
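For readers unfamiliar with the algorithm under discussion, the standard nonparametric SVGD update moves each particle along a kernel-averaged score term plus a repulsive term that keeps the particles spread out. The following minimal NumPy sketch (not code from the talk; the RBF bandwidth, step size, and target are illustrative choices) shows one such update for a generic target with known score function:

```python
import numpy as np

def svgd_step(x, grad_log_p, h=0.5, eps=0.1):
    """One SVGD update with an RBF kernel.

    x           : (n, d) array of particles
    grad_log_p  : function returning the score grad log p at each particle, shape (n, d)
    h, eps      : illustrative kernel bandwidth and step size
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]            # diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, -1) / (2 * h**2))   # K[i, j] = k(x_i, x_j), symmetric
    score = grad_log_p(x)                            # (n, d)
    # Attractive term: sum_j k(x_j, x_i) * grad log p(x_j)
    drift = K @ score
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) = sum_j (x_i - x_j)/h^2 * k
    repulsion = np.sum(diffs * K[:, :, None], axis=1) / h**2
    return x + eps * (drift + repulsion) / n

# Usage sketch: push particles toward a 1D standard normal (score = -x).
rng = np.random.default_rng(0)
particles = rng.normal(loc=2.0, scale=0.5, size=(100, 1))
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
```

After enough steps the particle cloud approximates the target; the mean-field and finite-particle convergence rates of exactly this kind of deterministic dynamics are the subject of the talk.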
Bio:
Krishna Balasubramanian is an Associate Professor in the Department of Statistics, University of California, Davis. He is also affiliated with the Graduate Group in Applied Mathematics, the Graduate Program in Electrical and Computer Engineering, the Center for Data Science and Artificial Intelligence Research (CeDAR), and the TETRAPODS Institute of Data Science at UC Davis. He was a visiting scientist at the Simons Institute for the Theory of Computing, UC Berkeley, in Fall 2021 and 2022. Previously, he completed his PhD in Computer Science at the Georgia Institute of Technology and was a postdoctoral researcher in the Department of Operations Research and Financial Engineering, Princeton University, and the Department of Statistics at UW-Madison. Krishna's research interests include stochastic optimization and sampling, deep learning, and nonparametric, geometric, and topological statistics. He has received a Facebook Fellowship, an ICML Best Paper runner-up award, and the INFORMS ICS Prize. He serves as an associate editor for the Annals of Statistics, IEEE Transactions on Information Theory, and the Journal of Machine Learning Research, and as a senior area chair for top machine learning conferences including the International Conference on Machine Learning (ICML), the Conference on Neural Information Processing Systems (NeurIPS), the International Conference on Learning Representations (ICLR), and the Conference on Learning Theory (COLT).