Aubin, P.-C. (2024). Extending convexity and gradient descent: a framework for general costs. In International Centre for Scientific Culture “E. Majorana” School of Mathematics “G. Stampacchia” (Ed.), Advances in Nonlinear Analysis and Optimization: Book of Abstracts (pp. 3–4).
E105-04 - Research Unit of Variational Calculus, Dynamical Systems and Operations Research
-
Published in:
Advances in Nonlinear Analysis and Optimization: Book of Abstracts
-
Volume:
Abstract Book
-
Date (published):
23-May-2024
-
Event name:
Advances in Nonlinear Analysis and Optimization (NAO 2024)
Event date:
23-May-2024 - 29-May-2024
-
Event place:
Erice, Sicily, Italy
-
Number of Pages:
2
-
Keywords:
Gradient Descent; General Cost Function; Optimization
Abstract:
I will present my recent research on going beyond the quadratic cost in optimization, replacing it with a general cost function and using a majorize-minimize framework. With Flavien Léger (INRIA Paris), we unveiled in https://arxiv.org/abs/2305.04917 a new class of gradient-type optimization methods that extends vanilla gradient descent, mirror descent, Riemannian gradient descent, and natural gradient descent, while keeping the same proof ideas and rates of convergence. Our approach constructs a surrogate for the objective function in a systematic manner, based on a chosen cost function; this surrogate is then minimized using an alternating minimization scheme. Using notions from optimal transport theory, we establish convergence rates based on generalized notions of L-smoothness and convexity. We provide local versions of these two notions when the cost satisfies a condition known as nonnegative cross-curvature. In particular, our framework provides the first global rates for natural gradient descent and Newton's method.
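The majorize-minimize viewpoint sketched in the abstract can be illustrated on two familiar special cases. The Python sketch below is an illustrative assumption, not the algorithm of the cited paper: with the quadratic cost c(x, y) = ||x - y||^2 / (2*sigma), minimizing the surrogate reduces to vanilla gradient descent, and with a Bregman cost built from a convex mirror map h it reduces to mirror descent. The toy objective, the entropic mirror map, the step size sigma, and all function names are hypothetical choices made for the example.

    # Minimal sketch (assumed setup, not the paper's general algorithm):
    # two special cases of gradient descent with a general cost.
    import numpy as np

    def gradient_descent(grad_f, x0, sigma=0.1, iters=200):
        # Quadratic cost c(x, y) = ||x - y||^2 / (2*sigma):
        # x_{k+1} = argmin_x <grad f(x_k), x - x_k> + ||x - x_k||^2 / (2*sigma)
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            x = x - sigma * grad_f(x)
        return x

    def mirror_descent(grad_f, grad_h, grad_h_conj, x0, sigma=0.1, iters=200):
        # Bregman cost induced by a convex h:
        # x_{k+1} = grad h*( grad h(x_k) - sigma * grad f(x_k) )
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            x = grad_h_conj(grad_h(x) - sigma * grad_f(x))
        return x

    if __name__ == "__main__":
        b = np.array([0.5, 1.5, 2.5])
        grad_f = lambda x: x - b                  # toy objective f(x) = 0.5 * ||x - b||^2
        grad_h = lambda x: np.log(x) + 1.0        # entropic mirror map h(x) = sum_i x_i log x_i
        grad_h_conj = lambda y: np.exp(y - 1.0)   # gradient of the convex conjugate h*
        x0 = np.ones(3)
        print("gradient descent:", gradient_descent(grad_f, x0))                       # approaches b
        print("mirror descent:  ", mirror_descent(grad_f, grad_h, grad_h_conj, x0))    # approaches b on the positive orthant

Both iterations arise from minimizing a cost-based surrogate of the objective; the general framework of the abstract replaces these particular costs with an arbitrary cost function and analyzes the resulting scheme via alternating minimization.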
Project title:
Unilaterality and Asymmetry in Variational Analysis: P 36344N (FWF - Austrian Science Fund)