2019
DOI: 10.1137/18m119046x
Distributed Subgradient-Free Stochastic Optimization Algorithm for Nonsmooth Convex Functions over Time-Varying Networks

Abstract: In this paper we consider a distributed stochastic optimization problem without gradient/subgradient information for the local objective functions, subject to local convex constraints. The objective functions may be nonsmooth and observed with stochastic noise, and the network for the distributed design is time-varying. By adding stochastic dithers to the local objective functions and constructing randomized differences motivated by the Kiefer-Wolfowitz algorithm, we propose a distributed subgra…
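The abstract's core idea — estimating a descent direction from two noisy function evaluations along a random perturbation, rather than from a subgradient — can be sketched as follows. This is a generic Kiefer-Wolfowitz-style randomized-difference step, not the paper's distributed algorithm; the Rademacher perturbation, step-size schedule, and test function are illustrative assumptions.

```python
import numpy as np

def randomized_difference_step(f, x, c, a, rng):
    """One gradient-free step in the spirit of Kiefer-Wolfowitz
    randomized-difference methods: perturb x along a random direction
    and use two function values to estimate a (sub)gradient-like direction."""
    d = x.size
    # Random perturbation direction (Rademacher, as in SPSA-type schemes)
    delta = rng.choice([-1.0, 1.0], size=d)
    # Two (possibly noisy) function evaluations at x +/- c*delta
    g_plus = f(x + c * delta)
    g_minus = f(x - c * delta)
    # Randomized-difference estimate replacing the unavailable subgradient
    grad_est = (g_plus - g_minus) / (2.0 * c) * delta
    return x - a * grad_est

# Usage: minimize the nonsmooth convex function f(x) = ||x - 1||_1
rng = np.random.default_rng(0)
f = lambda x: np.abs(x - 1.0).sum()
x = np.zeros(3)
for k in range(1, 2001):
    # Diminishing step and perturbation sizes, as in stochastic approximation
    x = randomized_difference_step(f, x, c=1.0 / k**0.25, a=0.5 / k, rng=rng)
```

Note that only function values of `f` are used; no subgradient of the L1 objective is ever computed, which is the setting the paper addresses.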

Cited by 35 publications (16 citation statements)
References 33 publications
“…One of the most common models in the distributed optimization literature is a time-varying network model that is represented by a time-varying graph (Definition 5). For this graph, the most common assumption is that there is a constant M such that the union graph associated with all time intervals [n, n + M ] is strongly connected [32,1,34,15]. A network with this property is typically called uniformly strongly connected [21], M -strongly connected [22,28] or jointly strongly connected [33].…”
Section: Network Models in the Literature
confidence: 99%
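The joint connectivity condition quoted above — that the union graph over every window [n, n + M] is strongly connected, even if no single snapshot is — can be checked directly. The following is a minimal sketch under assumed conventions (graphs as edge sets over a fixed node set, the window [n, n + M] taken inclusive); the function names are illustrative.

```python
from itertools import chain

def is_strongly_connected(nodes, edges):
    """Strong connectivity via reachability from one node in the
    graph and in its reverse."""
    def reachable(adj, start):
        seen, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for v in adj.get(u, ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    fwd, rev = {}, {}
    for u, v in edges:
        fwd.setdefault(u, []).append(v)
        rev.setdefault(v, []).append(u)
    start = next(iter(nodes))
    return reachable(fwd, start) == set(nodes) == reachable(rev, start)

def is_M_strongly_connected(nodes, graph_seq, M):
    """True if for every n the union of the edge sets over [n, n + M]
    is strongly connected -- the 'jointly/uniformly strongly connected'
    condition for time-varying networks."""
    for n in range(len(graph_seq) - M):
        union_edges = set(chain.from_iterable(graph_seq[n:n + M + 1]))
        if not is_strongly_connected(nodes, union_edges):
            return False
    return True

# A 3-node network alternating between two snapshots: neither snapshot
# is strongly connected on its own, but every 2-step union is.
nodes = {0, 1, 2}
g_a = {(0, 1), (1, 0)}   # edges present at even times
g_b = {(1, 2), (2, 1)}   # edges present at odd times
seq = [g_a, g_b, g_a, g_b, g_a, g_b]
```

Here `is_M_strongly_connected(nodes, seq, 1)` holds while `is_M_strongly_connected(nodes, seq, 0)` fails, illustrating why the union over a window, rather than each instantaneous graph, is what the assumption constrains.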
“…In refs. [16,22,32,28,34,14,3,15] the objective is for the agents to reach consensus on one global optimization variable that minimizes the sum of real-valued functions, each of which is associated with one of the local agents. Although such consensus-type problems might appear quite different from (1.1), it turns out that an algorithm for (1.1) can also find a solution for consensus problems after a minor reformulation, at the cost of additional communication, which we discuss in [27].…”
Section: Network Models in the Literature
confidence: 99%
“…As a result, distributed algorithms have attracted much research attention. In particular, distributed optimization, in which agents over the network cooperatively seek a global optimal solution, has become more and more popular [3,13,14,16,17].…”
Section: Introduction
confidence: 99%
“…Here, we assume that u_{i,k} and ξ_{i,k}, ∀i ∈ [n], k ≥ 1, are mutually independent, which is commonly assumed in stochastic optimization, e.g., [18], [22], [24], [32]–[35], [39], [40], [47], [48], [50], [54], [55], [60]. Let L_k denote the σ-algebra generated by the independent random variables…”
confidence: 99%