Journal Article

Generalized Gradient Flows With Provable Fixed-Time Convergence and Fast Evasion of Non-Degenerate Saddle Points
Document Type
Periodical
Source
IEEE Transactions on Automatic Control, 69(4):2281-2293, Apr. 2024
Subject
Signal Processing and Analysis
Convergence
Heuristic algorithms
Stability criteria
Dynamical systems
Cost function
Training
Neural networks
Accelerated optimization
continuous-time optimization
fixed-time convergence
minimax problem
saddle point evasion
Language
English
ISSN
0018-9286
1558-2523
2334-3303
Abstract
Gradient-based first-order convex optimization algorithms find widespread applicability in a variety of domains, including machine learning tasks. Motivated by recent advances in the fixed-time stability theory of continuous-time dynamical systems, we introduce a generalized framework for designing accelerated optimization algorithms with the strongest convergence guarantees, which further extend to a subclass of nonconvex functions. In particular, we introduce the GenFlow algorithm and its momentum variant, which provably converge to the optimal solution of objective functions satisfying the Polyak–Łojasiewicz inequality in a fixed time. Moreover, for functions that admit nondegenerate saddle points, we show that for the proposed GenFlow algorithm the time required to evade these saddle points is uniformly bounded for all initial conditions. Finally, for strongly convex–strongly concave minimax problems whose optimal solution is a saddle point, a similar scheme is shown to arrive at the optimal solution, again in a fixed time. The superior convergence properties of our algorithm are validated experimentally on a variety of benchmark datasets.
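To make the abstract's notion of a fixed-time gradient flow concrete, the following is a minimal Python sketch of a forward-Euler discretization of a rescaled gradient flow of the form dx/dt = -c1*||g||^(alpha-1)*g - c2*||g||^(beta-1)*g with 0 < alpha < 1 < beta, a standard template from the fixed-time stability literature. This is an illustration under stated assumptions, not the paper's exact GenFlow dynamics; the flow form, the parameters c1, c2, alpha, beta, and the quadratic test function are all chosen here for demonstration.

```python
import numpy as np

def fixed_time_flow_step(x, grad_f, dt=1e-3, c1=1.0, c2=1.0,
                         alpha=0.5, beta=1.5, eps=1e-12):
    """One forward-Euler step of a rescaled gradient flow
    dx/dt = -c1*||g||^(alpha-1)*g - c2*||g||^(beta-1)*g,
    with 0 < alpha < 1 < beta. The sublinear term (alpha < 1)
    dominates near the optimum and the superlinear term (beta > 1)
    dominates far from it, which is what yields fixed-time bounds
    in continuous time. Illustrative sketch, not the authors'
    exact GenFlow dynamics."""
    g = grad_f(x)
    n = np.linalg.norm(g)
    if n < eps:  # treat as a stationary point; avoid division by zero
        return x
    direction = c1 * n**(alpha - 1.0) * g + c2 * n**(beta - 1.0) * g
    return x - dt * direction

# Hypothetical usage on a PL-satisfying quadratic f(x) = 0.5*||A x - b||^2
A = np.array([[2.0, 0.0], [0.0, 0.5]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)

x = np.array([5.0, -5.0])
for _ in range(20000):
    x = fixed_time_flow_step(x, grad)
print(x)  # approaches the minimizer A^{-1} b = [0.5, -2.0]
```

Note that the fixed-time guarantee is a property of the continuous-time flow; a naive forward-Euler scheme like the one above chatters near the optimum because the sublinear term's effective step size grows as the gradient norm shrinks. Discretizations that retain the accelerated behavior are exactly the kind of issue the continuous-time optimization literature cited by this paper addresses.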