Academic Article

Distributed regularized primal-dual method
Document Type
Conference
Source
2016 IEEE Global Conference on Signal and Information Processing (GlobalSIP), pp. 540-544, Dec. 2016
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Optimization
Convergence
Linear programming
Upper bound
Distributed algorithms
Algorithm design and analysis
Decision feedback equalizers
Language
English
Abstract
We study a deterministic primal-dual subgradient method for distributed optimization of a separable objective function with global inequality constraints. To control the norm of dual variables, we augment the Lagrangian function with a regularizer on dual variables. Specifically, we show that under a certain restriction on the step size of the underlying algorithm, the norm of dual variables is inversely proportional to the regularizer's curvature. We leverage this result to bound the consensus terms and subsequently establish the convergence rate of the distributed primal-dual algorithm. We also establish an asymptotic decay rate on the norm of the constraint violation. We exhibit a tension between the convergence rate of the underlying algorithm and the decay rate associated with the constraint violation. Specifically, we show that when one of the inequality constraints is binding at optimal solutions, improving the convergence rate in the objective value deteriorates the decay rate of the constraint violation bound and vice versa. However, in the case that one optimal solution satisfies the inequality constraints strictly, the tension vanishes.
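To make the mechanism concrete, the following is a minimal single-agent sketch of a regularized primal-dual subgradient iteration on a toy problem (my illustration, not the paper's distributed multi-agent algorithm): minimize f(x) = x², subject to g(x) = 1 - x ≤ 0, using the regularized Lagrangian L(x, μ) = f(x) + μ·g(x) - (ν/2)μ², where ν > 0 is the curvature of the dual regularizer. The step size α and regularizer ν below are illustrative choices.

```python
def regularized_primal_dual(alpha=0.05, nu=0.01, iters=5000):
    """Primal gradient descent / projected dual ascent on the
    regularized Lagrangian L(x, mu) = x**2 + mu*(1 - x) - (nu/2)*mu**2."""
    x, mu = 0.0, 0.0
    for _ in range(iters):
        grad_x = 2.0 * x - mu               # d/dx of x**2 + mu*(1 - x)
        x = x - alpha * grad_x              # primal descent step
        grad_mu = (1.0 - x) - nu * mu       # d/dmu, incl. -nu*mu from regularizer
        mu = max(0.0, mu + alpha * grad_mu) # dual ascent, projected onto mu >= 0
    return x, mu

x, mu = regularized_primal_dual()
print(x, mu, 1.0 - x)  # primal iterate, dual iterate, constraint violation
```

Here the constraint is binding at the optimum (x* = 1, μ* = 2 for ν = 0), and the iterates settle near x = 1/(1 + 2ν), μ = 2/(1 + 2ν), so the residual violation g(x) = 1 - x shrinks as ν → 0 while the guaranteed bound on the dual norm grows like 1/ν, illustrating the tension between the two rates that the abstract describes for the binding-constraint case.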