Stochastics and Statistics Seminar
Finite-Particle Convergence Rates for Stein Variational Gradient Descent
Abstract: Stein Variational Gradient Descent (SVGD) is a deterministic, interacting particle-based algorithm for nonparametric variational inference, yet its theoretical properties remain challenging to fully understand. This talk presents two complementary perspectives on SVGD. First, we introduce Gaussian-SVGD, a framework that projects SVGD onto the family of Gaussian distributions using a bilinear kernel. We establish rigorous convergence results for both mean-field dynamics and finite-particle systems, proving linear convergence to equilibrium in strongly log-concave settings. This framework also unifies recent algorithms for…
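The abstract's interacting-particle dynamics can be illustrated by the standard SVGD update with an RBF kernel, in which each particle moves along a kernel-smoothed average of the target's score plus a repulsive kernel-gradient term. This is a minimal sketch of the generic algorithm, not the Gaussian-SVGD framework of the talk; the function names and hyperparameters are illustrative.

```python
import numpy as np

def svgd_step(x, grad_logp, bandwidth=1.0, step=0.1):
    """One SVGD update on an (n, d) array of particles.

    grad_logp maps an (n, d) array to the (n, d) scores of the target.
    """
    diffs = x[:, None, :] - x[None, :, :]                    # (n, n, d): x_j - x_i at [j, i]
    K = np.exp(-np.sum(diffs**2, -1) / (2 * bandwidth**2))   # RBF kernel matrix
    # Driving term: kernel-smoothed scores plus a repulsive kernel-gradient term.
    grad_K = -diffs * K[..., None] / bandwidth**2            # gradient in the first argument
    phi = (K @ grad_logp(x) + grad_K.sum(axis=0)) / x.shape[0]
    return x + step * phi

# Toy run: transport particles toward a standard normal target.
rng = np.random.default_rng(0)
x = rng.normal(5.0, 0.5, size=(50, 1))       # start far from the target
for _ in range(500):
    x = svgd_step(x, lambda p: -p)           # score of N(0, 1) is -x
```

The deterministic character of SVGD is visible here: given the initial particles, the trajectory is fixed, and the kernel-gradient term alone keeps the particles from collapsing onto the mode.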
How should we do linear regression?
Abstract: In the context of linear regression, we construct a data-driven convex loss function with respect to which empirical risk minimisation yields optimal asymptotic variance in the downstream estimation of the regression coefficients. Our semiparametric approach targets the best decreasing approximation of the derivative of the log-density of the noise distribution. At the population level, this fitting process is a nonparametric extension of score matching, corresponding to a log-concave projection of the noise distribution with respect to the Fisher divergence.…
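The pipeline described in the abstract (pilot fit, estimate of the noise score, best decreasing approximation, then empirical risk minimisation with the induced convex loss) can be sketched as follows. This is a rough illustration under simplifying assumptions, not the authors' estimator: the kernel score estimate, the pool-adjacent-violators projection, and all function names are stand-ins, and the decreasing projection of the score corresponds to an increasing loss derivative, hence a convex loss.

```python
import numpy as np

def pav_increasing(y):
    """Pool-adjacent-violators: best nondecreasing L2 fit to the sequence y."""
    vals, wts, cnts = [], [], []
    for v in y:
        vals.append(float(v)); wts.append(1.0); cnts.append(1)
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            vals[-2] = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / w
            wts[-2] = w; cnts[-2] += cnts[-1]
            vals.pop(); wts.pop(); cnts.pop()
    return np.repeat(vals, cnts)

def fit_convex_erm(X, y, steps=400, lr=0.1):
    # Pilot OLS fit and sorted residuals.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rs = np.sort(y - X @ beta)
    # Kernel estimate of the residual score (log f)' at the sorted residuals.
    h = 1.06 * rs.std() * len(rs) ** -0.2
    d = (rs[:, None] - rs[None, :]) / h
    Km = np.exp(-0.5 * d**2)
    score = (-d * Km).sum(axis=1) / (h * Km.sum(axis=1))
    # Best decreasing approximation of the score: the negative of this is the
    # (increasing) derivative of a convex loss.
    psi = -pav_increasing(-score)
    # ERM by gradient descent; the gradient of the mean loss is X^T psi(r) / n.
    for _ in range(steps):
        r = y - X @ beta
        beta = beta - lr * X.T @ np.interp(r, rs, psi) / len(y)
    return beta
```

For Gaussian noise the estimated score is close to linear, so the procedure roughly recovers least squares; for heavier-tailed noise the induced loss flattens in the tails, which is the source of the variance gain the abstract refers to.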