Events

15 May 2024, 16:30 until 18:30
SWM Colloquium: Optimizing my distributions and proving convergence, or how to look into the mirror. Pierre-Cyril Aubin, TU Wien
Many problems in machine learning and applied statistics can be formulated as optimizing a functional, e.g. the Kullback–Leibler (KL) divergence, over the space of probability measures. But can we guarantee that such an algorithm converges? Starting from Expectation-Maximization (EM), I will show that it can always be written as a mirror descent and present two cases: 1) the joint distribution is an exponential family; 2) the distribution is non-parametric, but only over the latent space. In both cases, EM involves only convex functions, and we obtain a (sub)linear convergence rate. Moving to variational inference in disguise, namely entropic optimal transport, I will then focus on the convergence of Sinkhorn's algorithm, a.k.a. IPFP or RAS, outlining its similarities with EM.
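For readers unfamiliar with it, the Sinkhorn/IPFP iteration alternately rescales a Gibbs kernel so that the resulting coupling matches the two prescribed marginals. The following minimal NumPy sketch illustrates the idea; the function name, the fixed iteration count, and the absence of a stopping criterion are illustrative choices, not the speaker's implementation:

    import numpy as np

    def sinkhorn(a, b, C, eps, n_iter=1000):
        # Entropic optimal transport via Sinkhorn iterations (IPFP/RAS).
        # a, b : source/target marginals (1-D arrays summing to 1)
        # C    : cost matrix of shape (len(a), len(b))
        # eps  : entropic regularization strength
        K = np.exp(-C / eps)                  # Gibbs kernel
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)                 # rescale to match column marginals
            u = a / (K @ v)                   # rescale to match row marginals
        return u[:, None] * K * v[None, :]    # transport plan diag(u) K diag(v)

After the final u-update, the row sums of the returned plan equal a exactly, while the column sums converge to b as the iterations proceed.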

Finally, I will show that both of these algorithms fall within a general majorize-minimize framework for which we prove novel rates of convergence based on a five-point property introduced by Csiszár and Tusnády (1984).
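As a reminder of the general template (standard material, not the talk's novel rates): a majorize-minimize step minimizes a surrogate that upper-bounds the objective and touches it at the current iterate, which immediately yields monotone descent. In LaTeX notation:

    \[
      \theta_{t+1} \in \arg\min_{\theta}\, g(\theta \mid \theta_t),
      \qquad
      f(\theta_{t+1}) \le g(\theta_{t+1} \mid \theta_t) \le g(\theta_t \mid \theta_t) = f(\theta_t),
    \]

where the surrogate satisfies $g(\theta \mid \theta_t) \ge f(\theta)$ for all $\theta$, with equality at $\theta = \theta_t$.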

The talk is based on joint work with Anna Korba (ENSAE, France) and Flavien Léger (INRIA Paris); see https://arxiv.org/abs/2305.04917, Sections 1, 4.7 and 4.8 for an overview.