1987
DOI: 10.1007/bf02591740
A generalization of Polyak's convergence result for subgradient optimization

Cited by 48 publications (32 citation statements)
References 13 publications
“…Under certain conditions on the step sizes, the convergence of the subgradient algorithm is well known [1,14,21]. In practice, however, it may be necessary to terminate the subgradient algorithm before optimality.…”
Section: Branch and Bound
confidence: 99%
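To make the step-size discussion in these excerpts concrete, the following is a minimal sketch of a plain subgradient method using the classical diminishing, non-summable step sizes t_k = 1/(k+1); the toy objective, function names, and iteration budget are illustrative assumptions, not taken from the cited papers.

import numpy as np

def subgradient_method(f, subgrad, x0, num_iters=500):
    # Plain subgradient method with step sizes t_k = 1/(k+1), which are
    # diminishing (t_k -> 0) and non-summable (sum of t_k diverges) -- the
    # classical conditions under which convergence of the best objective
    # value to the optimum is known.
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(num_iters):
        g = subgrad(x)                        # any subgradient of f at x
        t = 1.0 / (k + 1)                     # diminishing, non-summable step size
        x = x - t * g                         # subgradient step (not necessarily a descent step)
        fx = f(x)
        if fx < best_f:                       # keep the best iterate, since f(x_k)
            best_x, best_f = x.copy(), fx     # need not decrease monotonically
    return best_x, best_f

# Toy nonsmooth example (hypothetical data): minimize f(x) = ||x - c||_1.
c = np.array([1.0, -2.0, 0.5])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)            # a valid subgradient of the l1 distance
x_best, f_best = subgradient_method(f, subgrad, x0=np.zeros(3))
print(x_best, f_best)

Tracking the best iterate reflects the point made in the excerpt: in practice the method is often stopped after a fixed iteration budget, before optimality, and the best value found so far is reported.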
“…A convergence result for such a step size is given in [1,21]. The updated multipliers at iteration k + 1 are:…”
Section: Lagrangean Dual
confidence: 99%
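The excerpt elides the actual update formula, so for context only, here is a common Polyak-type form of such a multiplier update, written as a sketch rather than as the formula of the cited paper (the symbols λ^k for the multipliers, g^k for a subgradient of the Lagrangean dual L at λ^k, UB for a known upper bound on the optimal value, and θ_k for a scaling parameter are assumptions, not notation from the excerpt):

\[
\lambda^{k+1} = \max\bigl(0,\ \lambda^{k} + t_k\, g^{k}\bigr),
\qquad
t_k = \theta_k \,\frac{\mathrm{UB} - L(\lambda^{k})}{\lVert g^{k} \rVert^{2}},
\qquad 0 < \theta_k \le 2,
\]

where the max(0, ·) projection is taken componentwise and applies when the multipliers are associated with inequality constraints.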
“…In order to simplify the notation we let a denote the vector corresponding to the coefficients in (5); x denote the vector corresponding to the binary decision variables; c denote the vector of costs; and T = {x : x satisfies (2), (3), and (4)}. Then the singly constrained assignment problem can be stated as P1 = min {cx : x ∈ T and ax ≤ b}, and it is well-known that P1 is NP-complete.…”
Section: (i, j) ∈ E
confidence: 99%
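As an illustrative aside (standard Lagrangean-relaxation reasoning, not quoted from the citing paper): dualizing the single side constraint ax ≤ b with a multiplier λ ≥ 0 gives the Lagrangean dual

\[
L(\lambda) = \min_{x \in T} \bigl\{\, cx + \lambda\,(ax - b) \,\bigr\},
\qquad
\max_{\lambda \ge 0} L(\lambda),
\]

so each dual evaluation is an ordinary assignment problem over T, and ax(λ) − b, with x(λ) an inner minimizer, is a subgradient of the concave piecewise-linear function L at λ; this is the setting in which the step-size rules discussed in the other excerpts are applied.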
“…Johnson et al. [21] report that exact HK bounds have been computed by a special-purpose linear programming code for instances as large as 33810 cities. For even larger instances, the subgradient method proposed in the original Held and Karp papers, sped up by a number of algorithmic tricks [2,16,34,37,38], is applied. Since for large-scale instances the optimal solution is not known, comparing heuristic solutions against HK bounds is common practice.…”
Section: Introduction
confidence: 99%