2018
DOI: 10.1080/02331934.2018.1482491
An incremental mirror descent subgradient algorithm with random sweeping and proximal step

Abstract: We investigate the convergence properties of incremental mirror descent type subgradient algorithms for minimizing the sum of convex functions. In each step, we only evaluate the subgradient of a single component function and mirror it back to the feasible domain, which makes iterations very cheap to compute. The analysis is made for a randomized selection of the component functions, which yields the deterministic algorithm as a special case. Under supplementary differentiability assumptions…
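The abstract describes the scheme only at a high level; below is a minimal, hedged sketch of an incremental mirror descent step with random sweeping. It assumes the entropy mirror map over the probability simplex (so the mirror step is the exponentiated-gradient update) and a toy absolute-value objective; the data and function names are hypothetical, and this is not the authors' exact algorithm.

```python
# Sketch: incremental mirror descent with random sweeping for min_x sum_i f_i(x)
# over the probability simplex, using the negative-entropy mirror map, for which
# the "mirror back to the feasible domain" step is the exponentiated-gradient
# update followed by normalization. Illustrative only, not the paper's method.
import numpy as np

def incremental_mirror_descent(subgrads, x0, steps=1000, t=0.1, rng=None):
    """subgrads[i](x) returns a subgradient of the i-th component f_i at x.
    x0 lies in the simplex; t is a constant step size (for illustration only)."""
    rng = np.random.default_rng(rng)
    x = x0.copy()
    for _ in range(steps):
        i = rng.integers(len(subgrads))  # random sweeping: pick one component
        g = subgrads[i](x)               # only one subgradient evaluation per step
        y = x * np.exp(-t * g)           # mirror step for the entropy mirror map...
        x = y / y.sum()                  # ...then normalize back onto the simplex
    return x

# Toy problem (hypothetical data): f_i(x) = |a_i . x - b_i| over the simplex.
A = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
b = np.array([0.2, 0.5])
subgrads = [lambda x, a=a, bi=bi: np.sign(x @ a - bi) * a for a, bi in zip(A, b)]
print(incremental_mirror_descent(subgrads, np.ones(3) / 3, steps=5000, t=0.05, rng=0))
```

The incremental structure is visible in the loop body: each iteration touches a single component subgradient, so its cost does not grow with the number of summands.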

Cited by 1 publication (17 citation statements)
References 12 publications (28 reference statements)
“…As noted in [10], this scheme generalizes the classical subgradient method and is close to the subgradient projection algorithm.…”
Section: Preliminaries
confidence: 85%
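To make the noted relationship concrete: with the Euclidean mirror map ψ(x) = ½‖x‖², a mirror descent step reduces to the classical projected-subgradient update x⁺ = P_C(x − t g). A small sketch of that special case on a box constraint, with hypothetical data:

```python
# With the Euclidean mirror map, the mirror step becomes the classical
# projected-subgradient update: take a step along -g, then project onto C.
import numpy as np

def projected_subgradient(subgrad, project, x0, steps=1000, t=0.1):
    x = x0.copy()
    for _ in range(steps):
        x = project(x - t * subgrad(x))  # subgradient step, then projection onto C
    return x

# Example (hypothetical data): minimize ||x - c||_1 over the box [0, 1]^3.
c = np.array([1.5, -0.3, 0.7])
subgrad = lambda x: np.sign(x - c)
project = lambda x: np.clip(x, 0.0, 1.0)
print(projected_subgradient(subgrad, project, np.zeros(3), steps=500, t=0.01))
```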
“…Taking into consideration this remark, only the maximum in the construction (2) needs to be attained in the case of such compositions when the involved functions are proper, convex and semicontinuous, and the operators linear, in which case we say that they fulfill the property (2′). Unlike the construction proposed in [10], our approach is flexible enough to allow modifying Algorithm 3.10 in order to solve such problems as well.…”
Section: Algorithm 3.10
confidence: 99%
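The title also advertises a proximal step; the truncated abstract does not show how it enters the method, so the following is only a generic reference sketch of a proximal step (here for g = λ‖·‖₁, whose proximal operator is soft-thresholding), not the authors' construction:

```python
# Generic proximal step, shown for g(x) = lam * ||x||_1. How the paper combines
# proximal steps with mirror descent is not visible from the truncated abstract;
# this is only a reference sketch of the proximal operator itself.
import numpy as np

def prox_l1(v, lam):
    """prox_{lam*||.||_1}(v) = argmin_x lam*||x||_1 + 0.5*||x - v||^2."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

print(prox_l1(np.array([0.8, -0.3, 0.05]), lam=0.1))  # -> [ 0.7 -0.2  0. ]
```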