A Tractable Barycenter for Probability Measures in Machine Learning

Event Category:
Reading Seminar on Mathematics of Machine Learning
Speaker:
Lénaïc Chizat
Institution:
EPFL
We introduce a formulation of entropic Wasserstein barycenters that enjoys favorable optimization, approximation, and statistical properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport costs with respect to a family of given probability measures, plus an entropy term. We show that (i) this notion of barycenter is debiased, in the sense that it approximates the unregularized Wasserstein barycenter better than the naive entropic Wasserstein barycenter; (ii) it can be estimated efficiently from samples (as measured in relative entropy); and (iii) it lends itself naturally to a grid-free optimization algorithm which, in the mean-field limit, converges globally at an exponential rate.
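For context, the *naive* entropic barycenter that the abstract contrasts with can be computed on a fixed grid by iterative Bregman projections (Benamou et al., 2015). The sketch below is an illustration of that baseline, not the talk's debiased, grid-free algorithm; the grid, the regularization strength `eps`, and the test measures are all assumptions made for the example.

```python
import numpy as np

def entropic_barycenter(measures, weights, cost, eps=0.02, n_iter=200):
    """Naive entropic Wasserstein barycenter of discrete histograms on a
    shared grid, via iterative Bregman projections.
    NOTE: this is the biased baseline the abstract refers to, not the
    debiased formulation presented in the talk."""
    K = np.exp(-cost / eps)                      # Gibbs kernel
    v = [np.ones_like(b) for b in measures]
    for _ in range(n_iter):
        u = [b / (K @ v_k) for b, v_k in zip(measures, v)]
        # the current barycenter is the weighted geometric mean of K^T u_k
        log_a = sum(w * np.log(K.T @ u_k) for w, u_k in zip(weights, u))
        a = np.exp(log_a)
        v = [a / (K.T @ u_k) for u_k in u]
    return a / a.sum()

# Example: two Gaussian histograms on [0, 1], equal weights.
x = np.linspace(0, 1, 100)
cost = (x[:, None] - x[None, :]) ** 2            # squared-distance cost

def gauss(m, s):
    g = np.exp(-(x - m) ** 2 / (2 * s ** 2))
    return g / g.sum()

bary = entropic_barycenter([gauss(0.25, 0.05), gauss(0.75, 0.05)],
                           [0.5, 0.5], cost)
print(x[np.argmax(bary)])  # mode of the barycenter, near 0.5 by symmetry
```

By symmetry, the equal-weight barycenter of two Gaussians centered at 0.25 and 0.75 concentrates around 0.5; the entropic regularization additionally blurs it, which is the bias that the debiased barycenter of the talk removes.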

Ref: https://arxiv.org/abs/2303.11844

Friday, March 31, 2023 - 10:00am
Zoom URL: https://umass-amherst.zoom.us/j/91345147149