2020
DOI: 10.1109/tac.2019.2936355
A Smooth Double Proximal Primal-Dual Algorithm for a Class of Distributed Nonsmooth Optimization Problems

Abstract: This technical note studies a class of distributed nonsmooth convex consensus optimization problems. The cost function is a summation of local cost functions that are convex but nonsmooth. Each local cost function consists of a twice differentiable (smooth) convex function and two lower semi-continuous (nonsmooth) convex functions. We call this the single-smooth plus double-nonsmooth (SSDN) problem. Under mild conditions, we propose a distributed double proximal primal-dual optimization algorithm…
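
For orientation, here is a minimal sketch of the problem structure the abstract describes, written with placeholder symbols ($f_i$ for the smooth part, $g_i$ and $h_i$ for the two nonsmooth parts, $x$ for the common decision variable) chosen here for illustration rather than taken from the paper:

```latex
\min_{x \in \mathbb{R}^n} \ \sum_{i=1}^{N} \Big( f_i(x) + g_i(x) + h_i(x) \Big),
\qquad
\begin{aligned}
& f_i \ \text{twice differentiable and convex (smooth term)},\\
& g_i,\, h_i \ \text{lower semi-continuous and convex (the two nonsmooth terms)}.
\end{aligned}
```

In the distributed setting, agent $i$ only knows its own triple $(f_i, g_i, h_i)$ and exchanges information with its neighbors to reach consensus on a common minimizer.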

Cited by 27 publications (13 citation statements)
References 32 publications
“…For the nonsmooth resource allocation problem (4), consider the case where the communication topology is a strongly connected and weight-balanced digraph. If Assumptions 1 and 2 hold, with the initial condition satisfying $\sum_{i=1}^{N} w_i(0) = 0_n$, then $(y^*, x^*, s^*, w^*)$ is an equilibrium point of (10) if and only if $y^*$ is an optimal solution of (4).…”
Section: B Convergence Analysis Of Algorithm
confidence: 99%
“…Let $g_k = \nabla g(x_k) + e_{k+1}$. It follows from (8) and the definition of inexact proximal operator (3) that…”
Section: Assumption III.2 Consider the Undirected Time-Varying Network...
confidence: 99%
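
The quote above refers to an inexact proximal operator; the cited paper's definition (3) is not reproduced on this page, so the following is only a generic sketch: the exact proximal operator of the $\ell_1$ norm (soft-thresholding) together with an $\varepsilon$-inexact evaluation that perturbs it by a bounded error, loosely mirroring the role of $e_{k+1}$ in the quoted identity. The function names and the error model are assumptions made for illustration.

```python
import numpy as np

def prox_l1(v, lam):
    """Exact proximal operator of lam * ||.||_1 (soft-thresholding):
    argmin_x  lam * ||x||_1 + 0.5 * ||x - v||^2."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def inexact_prox_l1(v, lam, eps=1e-3, rng=None):
    """Inexact proximal step: the exact prox plus an error e with ||e|| <= eps,
    a simple stand-in for the e_{k+1} term in the quoted update."""
    rng = np.random.default_rng() if rng is None else rng
    e = rng.standard_normal(v.shape)
    e *= eps / max(np.linalg.norm(e), 1e-12)  # rescale so that ||e|| <= eps
    return prox_l1(v, lam) + e

# Example: one exact and one inexact proximal evaluation
v = np.array([1.5, -0.2, 0.7])
print(prox_l1(v, 0.5))          # roughly [1., -0., 0.2]
print(inexact_prox_l1(v, 0.5))  # exact value perturbed by at most 1e-3
```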
“…One fundamental model for distributed non-smooth non-convex optimization, arising from optimization problems such as Lasso [1], SVM [2], and optimizing neural networks [3], is that each local objective function of a node is the summation of a (non-convex) differentiable function and a non-smooth convex function ($\ell_1$ norm or indicator function). Although the research on distributed optimization has made significant progress on non-smooth convex problems [4]–[8], distributed non-smooth non-convex optimization is still challenging.…”
Section: Introduction
confidence: 99%
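
As a concrete instance of the model described in that quote (a hedged illustration only; the data matrix $A_i$, vector $b_i$, and weight $\lambda$ are placeholder names, not notation from the cited paper), the local Lasso objective of node $i$ has exactly this differentiable-plus-nonsmooth form:

```latex
F_i(x) \;=\; \underbrace{\tfrac{1}{2}\,\lVert A_i x - b_i \rVert_2^2}_{\text{differentiable}}
\;+\; \underbrace{\lambda \lVert x \rVert_1}_{\text{nonsmooth convex}} .
```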
“…Definition 2 ($\ell$-Lipschitz [36,37]). A function $f(\cdot): S \to \mathbb{R}^n$ is Lipschitz with constant $\ell > 0$, or simply $\ell$-Lipschitz, over the set $S$ if…”
Section: Convex Analysis
confidence: 99%
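
The condition that the excerpt truncates is presumably the standard Lipschitz inequality; written out under that assumption, and writing the constant as $\ell$ (the original symbol does not survive in the excerpt), it reads:

```latex
\lVert f(x) - f(y) \rVert \;\le\; \ell\, \lVert x - y \rVert
\qquad \text{for all } x, y \in S .
```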
“…$f_i(x_i)$ and $g_i(x_i)$ represent the quadratic objective and the $\ell_1$ penalty with the anchor 0 for each agent $i$, respectively [36]. Obviously, $g_i(x)$ $(i = 1, 2, \ldots, 6)$ is nondifferentiable. $f_i(x)$ $(i = 1, 2, \ldots, 6)$ is $m_i$-strongly convex and its gradient satisfies the $\ell_i$-Lipschitz condition.…”
Section: An Example For Optimal Consensus
confidence: 99%
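
To make the quoted properties concrete with assumed placeholder coefficients $a_i > 0$, $c_i \in \mathbb{R}$, and $\lambda_i > 0$ (not taken from the cited example), consider the scalar quadratic-plus-$\ell_1$ pair

```latex
f_i(x) = a_i (x - c_i)^2, \qquad g_i(x) = \lambda_i \lvert x \rvert .
```

Then $f_i$ is $m_i$-strongly convex and its gradient is Lipschitz with $m_i = \ell_i = 2 a_i$, while $g_i$ is convex but nondifferentiable at the anchor $x = 0$, matching the structure described in the quote.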