2018
DOI: 10.1109/jproc.2018.2817461

Network Topology and Communication-Computation Tradeoffs in Decentralized Optimization

Abstract: In decentralized optimization, nodes cooperate to minimize an overall objective function that is the sum (or average) of per-node private objective functions. Algorithms interleave local computations with communication among all or a subset of the nodes. Motivated by a variety of applications (decentralized estimation in sensor networks, fitting models to massive data sets, and decentralized control of multi-robot systems, to name a few), significant advances have been made towards the development of robust, pract…
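The setup the abstract describes (local computation interleaved with communication) can be sketched with a minimal decentralized gradient descent loop. This is an illustrative toy example, not the paper's algorithm: the ring topology, the lazy mixing weights, and the quadratic objectives f_i(x) = 0.5(x - b_i)^2 are all assumptions chosen so the global minimizer is simply the average of the private data b_i.

```python
# Decentralized gradient descent sketch: N nodes on a ring graph cooperate to
# minimize (1/N) * sum_i f_i(x) with f_i(x) = 0.5 * (x - b_i)**2, whose global
# minimizer is the average of the b_i. All quantities here are illustrative.

N = 4
b = [1.0, 2.0, 3.0, 4.0]   # private data held at each node; mean is 2.5
x = [0.0] * N              # local iterates

def mix(x):
    # Communication step: doubly stochastic "lazy" ring averaging,
    # weight 1/2 on self and 1/4 on each ring neighbor.
    return [0.5 * x[i] + 0.25 * x[(i - 1) % N] + 0.25 * x[(i + 1) % N]
            for i in range(N)]

for k in range(5000):
    alpha = 1.0 / (k + 2)  # diminishing step size
    x = mix(x)             # communicate with neighbors
    x = [x[i] - alpha * (x[i] - b[i]) for i in range(N)]  # local gradient step

# Every node approaches the global minimizer mean(b) = 2.5.
```

The loop makes the communication-computation tradeoff concrete: each iteration costs one neighbor exchange plus one local gradient evaluation, and the mixing step is what pulls the private iterates toward consensus.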

Cited by 428 publications (339 citation statements)
References 111 publications
“…Comparing (43), (42) and (44) with the KKT conditions (14), (15) and (16), we conclude that the triple (x_k, z_k, φ_k) satisfies the KKT conditions when k goes to infinity. Next, we show that {(x_k, z_k, φ_k)} converges when k → ∞.…”
Section: Appendix B Proof of Theorem
confidence: 84%
“…In this section we review the distributed subgradient method that has been proposed in the pioneering works [11,12] (see also the tutorial papers [2,3,4]). In this survey, we report a proof based on the analysis proposed in the references above.…”
Section: Distributed Subgradient Methods
confidence: 99%
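The distributed subgradient method reviewed in the excerpt above can be sketched as follows: each node averages with its neighbors, then takes a step along a local subgradient with a diminishing step size. This is a toy illustration under assumptions not in the excerpt: nonsmooth objectives f_i(x) = |x - b_i| (so the sum is minimized at the median of the b_i), a ring topology, and lazy Metropolis-style weights.

```python
# Distributed subgradient sketch: consensus averaging followed by a local
# subgradient step with diminishing step size. With f_i(x) = |x - b_i|,
# sum_i f_i is minimized at the median of b. Illustrative toy example.

import math

N = 5
b = [0.0, 1.0, 2.0, 3.0, 10.0]   # private data; the median is 2.0
x = [0.0] * N

def sgn(v):
    # A subgradient of |v| at v (any value in [-1, 1] is valid at v = 0).
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

for k in range(20000):
    alpha = 0.5 / math.sqrt(k + 1)  # diminishing step size, sum diverges
    mixed = [0.5 * x[i] + 0.25 * x[(i - 1) % N] + 0.25 * x[(i + 1) % N]
             for i in range(N)]     # consensus averaging over the ring
    x = [mixed[i] - alpha * sgn(mixed[i] - b[i]) for i in range(N)]

# All nodes cluster around the median of b (here 2.0), illustrating robustness
# to the outlier b = 10.0 that a plain average would not have.
```

Note the contrast with gradient-based variants: because the objectives are nonsmooth, a diminishing step size is needed for exact convergence, which is the tradeoff the convergence analyses in the cited tutorials quantify.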
“…It also discusses tailored, parallel and distributed optimization algorithms based on decomposition techniques, including the distributed subgradient method. Recent surveys thoroughly analyze average consensus [2] and the distributed subgradient method [2,3,4], with a literature review of other distributed optimization techniques. The book [5] provides parallel and distributed asynchronous optimization algorithms, including gradient tracking techniques.…”
Section: Scope of the Monograph
confidence: 99%
“…One basic research problem is the leaderless consensus problem, in which the agents reach a common value of interest by interacting with their local neighbors. The consensus algorithm initiated the research trend in this area and has triggered many applications, including formation [4], distributed optimization [5], synchronization of biochemical networks [6], and cooperative adaptive identification [7].…”
Section: Introduction
confidence: 99%
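The leaderless consensus update described in the excerpt above, stripped of any optimization objective, reduces to repeated local averaging. A minimal sketch, assuming a ring of five agents with doubly stochastic "lazy" weights (an illustrative choice, not taken from the cited works):

```python
# Average consensus sketch: each agent repeatedly replaces its value with a
# weighted average of its own value and its ring neighbors' values. With
# doubly stochastic weights, all agents converge to the network-wide average.

N = 5
x = [0.0, 1.0, 2.0, 3.0, 4.0]   # initial private values; the average is 2.0

for _ in range(200):
    x = [0.5 * x[i] + 0.25 * x[(i - 1) % N] + 0.25 * x[(i + 1) % N]
         for i in range(N)]

# Every agent's value is now numerically indistinguishable from the
# average of the initial values, 2.0.
```

Convergence is geometric at a rate set by the mixing matrix's second-largest eigenvalue magnitude, which is how the network topology enters the communication cost of every algorithm built on top of consensus.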