
The Nesterov-Spokoiny Acceleration Achieves Strict $o(1/k^2)$ Convergence
Document Type: Working Paper
Subject: Mathematics - Optimization and Control
Abstract
We study a variant of an accelerated gradient algorithm of Nesterov and Spokoiny, which we call the Nesterov--Spokoiny Acceleration (NSA). The NSA algorithm satisfies the following property: for a smooth convex objective $f \in \mathscr{F}_{L}^{\infty,1} (\mathbb{R}^n) $, the sequence $\{ \mathbf{x}_k \}_{k \in \mathbb{N}}$ governed by NSA satisfies $ \limsup\limits_{k \to \infty } k^2 ( f (\mathbf{x}_k ) - f^* ) = 0 $, where $f^* > -\infty$ is the minimum of $f$. That is, $f(\mathbf{x}_k) - f^* = o(1/k^2)$, strictly faster than the classical $O(1/k^2)$ rate of accelerated gradient methods.
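The abstract does not spell out the NSA update rule. As background, a minimal sketch of the classical Nesterov accelerated gradient method, the standard $O(1/k^2)$ scheme that NSA is a variant of, might look like the following; the function name, momentum choice $k/(k+3)$, and the quadratic test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nesterov_agd(grad, x0, L, iters):
    """Sketch of the classical Nesterov accelerated gradient method
    for a convex, L-smooth objective (not the NSA variant itself)."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(iters):
        x_next = y - grad(y) / L                    # gradient step at the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)   # momentum extrapolation
        x = x_next
    return x

# Illustrative test problem (assumed): f(x) = 0.5 * ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)      # gradient of f
L = np.linalg.eigvalsh(A.T @ A).max()   # smoothness constant = largest Hessian eigenvalue
x_star = np.linalg.solve(A, b)          # exact minimizer of this quadratic
x_hat = nesterov_agd(grad, np.zeros(2), L, 1000)
```

The classical guarantee for this scheme is $f(\mathbf{x}_k) - f^* \le 2L\|\mathbf{x}_0 - \mathbf{x}^*\|^2/(k+1)^2$; the paper's contribution is that its variant attains the strictly stronger $o(1/k^2)$ behavior stated above.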