
A Tractable Barycenter for Probability Measures in Machine Learning

We introduce a formulation of entropic Wasserstein barycenters that enjoys favorable optimization, approximation, and statistical properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport costs with respect to a family of given probability measures, plus an entropy term. We show that (i) this notion of barycenter is debiased, in the sense that it approximates the unregularized Wasserstein barycenter better than the naive entropic Wasserstein barycenter does; (ii) it can be estimated efficiently from samples (with error measured in relative entropy); and (iii) it lends itself naturally to a grid-free optimization algorithm that, in the mean-field limit, converges globally at an exponential rate.
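To make the objects in the abstract concrete, here is a minimal sketch of the *naive* entropic Wasserstein barycenter on a fixed 1-D grid, computed with iterative Bregman projections (the classical Sinkhorn-style scheme). This is the biased baseline the abstract contrasts against, not the debiased, grid-free formulation of the talk; the function name, grid, and parameter values are all illustrative.

```python
import numpy as np

def entropic_barycenter(measures, grid, eps=0.05, weights=None, n_iter=300):
    """Naive fixed-support entropic barycenter via iterative Bregman
    projections (illustrative sketch; squared-distance ground cost)."""
    # Gibbs kernel of the entropic OT problem on the grid.
    K = np.exp(-(grid[:, None] - grid[None, :]) ** 2 / eps)
    B = np.stack(measures)                 # (m, n): one row per input measure
    m, n = B.shape
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights, float)
    v = np.ones((m, n))                    # Sinkhorn scalings, one per measure
    for _ in range(n_iter):
        u = B / (v @ K.T)                  # enforce the k-th marginal b_k
        Ktu = u @ K                        # K^T u_k, computed row-wise
        a = np.exp(w @ np.log(Ktu))        # geometric mean = current barycenter
        v = a[None, :] / Ktu               # enforce the common barycenter marginal
    return a / a.sum()

# Barycenter of two Gaussian-like histograms centered at -0.5 and +0.5:
# by symmetry its mean lies near 0, but entropic bias blurs it relative
# to the true Wasserstein barycenter (a bump of the original width at 0).
grid = np.linspace(-1.0, 1.0, 60)
b1 = np.exp(-(grid + 0.5) ** 2 / 0.01); b1 /= b1.sum()
b2 = np.exp(-(grid - 0.5) ** 2 / 0.01); b2 /= b2.sum()
bary = entropic_barycenter([b1, b2], grid)
```

The blur visible in this baseline is exactly the bias that the debiased formulation in the abstract is designed to remove.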

## Department of Mathematics and Statistics