Aubin, P.-C. (2024, May 15). Optimizing my distributions and proving convergence, or how to look into the mirror [Presentation]. SWM Colloquium, Wien, Austria.
E105-04 - Research Unit of Variational Calculus, Dynamical Systems and Operations Research
Date (published):
15-May-2024
Event name:
SWM Colloquium
Event date:
15-May-2024
Event place:
Wien, Austria
Keywords:
Optimization; Gradient descent; Convergence
Abstract:
Many problems in machine learning and applied statistics can be formulated as optimizing a functional, e.g. the Kullback–Leibler (KL) divergence, over the space of probability measures. But can we guarantee that we have a convergent algorithm? Starting from Expectation-Maximization (EM), I will show that it can always be written as a mirror descent, and present two cases: 1) the joint distribution is an exponential family, and 2) the distribution is non-parametric, but only over the latent space. In these cases, EM involves only convex functions and we obtain a (sub)linear convergence rate. Moving to variational inference in disguise, namely entropic optimal transport, I will then focus on the convergence of Sinkhorn's algorithm, a.k.a. IPFP or RAS, outlining the similarities with EM.
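For readers unfamiliar with mirror descent, the generic update for minimizing a functional F with step size γ and a Bregman divergence D_φ (a textbook form, not the talk's precise construction) reads

\[
\mu_{k+1} \in \operatorname*{arg\,min}_{\mu} \; \langle \nabla F(\mu_k), \mu \rangle + \tfrac{1}{\gamma}\, D_\phi(\mu, \mu_k),
\qquad
D_\phi(\mu, \nu) = \phi(\mu) - \phi(\nu) - \langle \nabla \phi(\nu), \mu - \nu \rangle,
\]

and taking φ to be the negative entropy makes D_φ the KL divergence. A minimal numerical sketch of Sinkhorn's algorithm on discrete marginals follows; the regularization eps, the fixed iteration count, and all variable names are illustrative assumptions rather than choices made in the talk.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic optimal transport via Sinkhorn/IPFP: alternately rescale
    the Gibbs kernel until the plan's marginals match a and b."""
    K = np.exp(-C / eps)                 # Gibbs kernel exp(-cost/eps)
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)                  # enforce the row marginal a
        v = b / (K.T @ u)                # enforce the column marginal b
    return u[:, None] * K * v[None, :]   # transport plan diag(u) K diag(v)

# Tiny usage example on uniform marginals and a random cost matrix
rng = np.random.default_rng(0)
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)
P = sinkhorn(a, b, rng.random((5, 7)))
print(P.sum(axis=1) - a)  # row-marginal residual, small after convergence
```

Each iteration matches one marginal while holding the other scaling fixed, the alternating structure the abstract compares to EM.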
Finally, I will show that both of these algorithms fall within a general majorize-minimize framework, for which we prove novel rates of convergence based on a five-point property introduced by Csiszár and Tusnády (1984).
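As background, a generic majorize-minimize step (a standard textbook formulation; the framework in the paper is stated more generally) builds a surrogate g(· | x_k) that lies above the objective f and touches it at the current iterate:

\[
g(x \mid x_k) \ge f(x) \quad \forall x, \qquad g(x_k \mid x_k) = f(x_k), \qquad x_{k+1} \in \operatorname*{arg\,min}_x \, g(x \mid x_k).
\]

This guarantees monotone descent, since f(x_{k+1}) \le g(x_{k+1} \mid x_k) \le g(x_k \mid x_k) = f(x_k); quantitative rates require additional structure, such as the five-point property mentioned above.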
The talk is based on joint work with Anna Korba (ENSAE, France) and Flavien Léger (INRIA Paris); see https://arxiv.org/abs/2305.04917, Sections 1, 4.7 and 4.8, for an overview.
Project title:
Unilaterality and Asymmetry in Variational Analysis: P 36344N (FWF - Austrian Science Fund)