Stochastics and Statistics Seminar

Consensus-based optimization and sampling

May 3 @ 11:00 am - 12:00 pm

Franca Hoffmann, California Institute of Technology

E18-304

Abstract: Particle methods provide a powerful paradigm for solving complex global optimization problems and lead to highly parallelizable algorithms. Despite widespread and growing adoption, the theory underpinning their behavior has been based mainly on meta-heuristics. In application settings involving black-box procedures, or where gradients are too costly to obtain, one relies on derivative-free approaches instead. This talk will focus on two recent techniques, consensus-based optimization and consensus-based sampling. We explain how these methods can be used for two goals: (i) generating approximate samples from a given target distribution, and (ii) optimizing a given objective function. They circumvent the need for gradients via Laplace's principle. We investigate the properties of this family of methods for various parameter choices and present an overview of recent advances in the field.
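
For readers unfamiliar with the mechanics, the following is a minimal sketch of the basic consensus-based optimization (CBO) dynamics alluded to in the abstract: an ensemble of particles is repeatedly pulled toward a weighted average whose Gibbs weights exp(-beta f(x)) concentrate, by Laplace's principle, near the best-performing particle, while scaled noise keeps the search exploratory. The objective, parameter values, and the componentwise (anisotropic) noise scaling used here are illustrative assumptions, not the specific scheme presented in the talk.

import numpy as np

def consensus_based_optimization(f, dim, n_particles=200, beta=30.0,
                                 lam=1.0, sigma=0.7, dt=0.01, n_steps=2000,
                                 seed=0):
    # Minimize f : R^dim -> R with a basic CBO iteration. The weights
    # exp(-beta * f(x_i)) concentrate, by Laplace's principle, near the
    # best current particle, so the consensus point tracks the minimizer
    # without any gradient evaluations.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))  # initial ensemble

    for _ in range(n_steps):
        fx = f(X)                                # objective values, shape (N,)
        w = np.exp(-beta * (fx - fx.min()))      # stabilized Gibbs weights
        x_bar = (w[:, None] * X).sum(axis=0) / w.sum()  # weighted consensus point

        drift = X - x_bar                        # displacement from consensus
        noise = rng.standard_normal(size=X.shape)
        # Relax particles toward the consensus point, plus exploration noise
        # scaled by the distance to it, so fluctuations vanish at consensus.
        X = X - lam * drift * dt + sigma * np.abs(drift) * np.sqrt(dt) * noise

    return x_bar

if __name__ == "__main__":
    # Toy multimodal objective with global minimum at (1, 1).
    def rastrigin(X):
        Z = X - 1.0
        return np.sum(Z**2 - 10.0 * np.cos(2.0 * np.pi * Z) + 10.0, axis=1)

    print(consensus_based_optimization(rastrigin, dim=2))

A consensus-based sampler follows the same ensemble structure but rescales the dynamics so that, instead of collapsing onto a single point, the particles equilibrate to an approximation of the target distribution.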

Bio: Prof. Franca Hoffmann’s research interests lie at the interface of model-driven and data-driven approaches. She works on the development and application of mathematical tools for partial differential equation (PDE) analysis and data analysis.
Broadly, Franca’s interests in the area of partial differential equations revolve around non-linear drift-diffusion equations, kinetic theory, many-particle systems and their mean-field limits, gradient flows, entropy methods, optimal transport, functional inequalities, parabolic and hyperbolic scaling techniques, and hypocoercivity. In the area of data analysis, Franca works on graph-based learning and on the development and analysis of optimization and sampling algorithms. The use of graph Laplacians in graph-based learning allows for a rigorous mathematical analysis of unsupervised and semi-supervised learning algorithms, and their continuum counterparts can be studied using tools from PDE theory. Optimization and sampling are at the heart of parameter estimation and uncertainty quantification in Bayesian inference, and are used in many modern machine learning approaches.
Franca works at the intersection of these fields, not only exploring what mathematical analysis can do for applications, but also what applications can do for mathematics.

Franca obtained her master’s in mathematics from Imperial College London (UK) and holds a PhD from the Cambridge Centre for Analysis at the University of Cambridge (UK). She held the position of von Kármán Instructor at Caltech from 2017 to 2020, then joined the University of Bonn (Germany) as Junior Professor and Quantum Leap Africa in Kigali, Rwanda (African Institute for Mathematical Sciences) as AIMS-Carnegie Research Chair in Data Science, before arriving at the California Institute of Technology as Assistant Professor in 2022.
