Journal Article

Transient Growth of Accelerated Optimization Algorithms
Document Type
Periodical
Source
IEEE Transactions on Automatic Control, 68(3):1823-1830, Mar. 2023
Subject
Signal Processing and Analysis
Transient analysis
Optimization
Eigenvalues and eigenfunctions
Convergence
Heuristic algorithms
Linear systems
Transient response
Convex optimization
first-order optimization algorithms
heavy-ball method
integral quadratic constraints (IQCs)
Nesterov’s accelerated method
nonasymptotic behavior
nonnormal matrices
transient growth
Language
English
ISSN
0018-9286
1558-2523
2334-3303
Abstract
Optimization algorithms are increasingly being used in applications with limited time budgets. In many real-time and embedded scenarios, only a few iterations can be performed, and traditional convergence metrics cannot be used to evaluate performance in these nonasymptotic regimes. In this article, we examine the transient behavior of accelerated first-order optimization algorithms. For convex quadratic problems, we employ tools from linear systems theory to show that transient growth arises from the presence of nonnormal dynamics. We identify the existence of modes that yield algebraic growth in early iterations and quantify the transient excursion from the optimal solution caused by these modes. For strongly convex smooth optimization problems, we utilize the theory of integral quadratic constraints to establish an upper bound on the magnitude of the transient response of Nesterov’s accelerated algorithm. We show that both the Euclidean distance between the optimization variable and the global minimizer and the rise time to the transient peak are proportional to the square root of the condition number of the problem. Finally, for problems with large condition numbers, we demonstrate that the bounds we derive are tight up to constant factors.
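As an illustrative sketch (not taken from the article), the following Python snippet simulates the heavy-ball method with its conventional parameters on an ill-conditioned convex quadratic and records the Euclidean distance to the minimizer. The initialization along the weak-curvature direction is an assumption chosen here to expose the transient peak, which, as the abstract describes, scales with the square root of the condition number.

```python
# Minimal sketch (assumed setup, not the article's experiments): transient
# growth of the heavy-ball method on an ill-conditioned convex quadratic
# f(x) = 0.5 * x^T Q x, whose minimizer is x* = 0.
import numpy as np

kappa = 1e4                      # condition number L/m
m, L = 1.0, kappa                # strong convexity and smoothness constants
Q = np.diag([m, L])              # Hessian of the quadratic

# Conventional heavy-ball parameters for a quadratic with known m and L.
alpha = 4.0 / (np.sqrt(L) + np.sqrt(m)) ** 2
beta = ((np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)) ** 2

# Initialize the two-step state along the weak-curvature direction with
# x_0 != x_{-1}; the pair (x_0, x_{-1}) has unit norm (illustrative choice).
x_prev = np.array([-1.0, 0.0]) / np.sqrt(2.0)   # x_{-1}
x = np.array([1.0, 0.0]) / np.sqrt(2.0)         # x_0

dist = [np.linalg.norm(x)]       # Euclidean distance to the minimizer x* = 0
for _ in range(2000):
    # Heavy-ball update: gradient step plus momentum term.
    x, x_prev = x - alpha * (Q @ x) + beta * (x - x_prev), x
    dist.append(np.linalg.norm(x))

dist = np.array(dist)
print(f"initial distance : {dist[0]:.3f}")
print(f"peak distance    : {dist.max():.1f} at iteration {dist.argmax()}")
print(f"sqrt(kappa)      : {np.sqrt(kappa):.1f}")
```

In this particular setup, the peak distance occurs after roughly sqrt(kappa)/2 iterations and exceeds the initial distance by more than an order of magnitude, consistent with the transient excursion the abstract attributes to nonnormal dynamics.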