2011
DOI: 10.1007/s00158-011-0634-y

A feasible directions method for nonsmooth convex optimization

Abstract: We propose a new technique for the minimization of convex functions that are not necessarily smooth. Our approach employs an equivalent constrained optimization problem and approximate linear programs obtained with cutting planes. At each iteration a search direction and a step length are computed. If the step length is considered "non-serious", a cutting plane is added and a new search direction is computed. This procedure is repeated until a "serious" step is obtained. When this happens, the search direction is a feasib…
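The loop described in the abstract, rewriting min f(x) as the epigraph problem min z subject to f(x) ≤ z, replacing f by cutting planes, and accepting a trial point only when the step is "serious", can be illustrated with a generic cutting-plane sketch. The code below is not the paper's NFDA: it omits the feasible-direction search and the paper's step-length rules, and the test function, box bound, and tolerances are assumptions made only for illustration.

```python
# Minimal sketch (assumed details, not the paper's NFDA) of a cutting-plane loop
# with serious/null steps for minimizing a nonsmooth convex function.
import numpy as np
from scipy.optimize import linprog

def f_and_subgrad(x):
    """Toy nonsmooth convex objective f(x) = max_i |x_i| with one subgradient."""
    i = int(np.argmax(np.abs(x)))
    g = np.zeros_like(x)
    g[i] = np.sign(x[i]) if x[i] != 0 else 1.0
    return float(np.abs(x).max()), g

def cutting_plane_serious_steps(x0, box=10.0, m=0.1, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx, gx = f_and_subgrad(x)
    cuts = [(fx, gx, x.copy())]              # bundle of linearizations of f
    for _ in range(max_iter):
        # LP in (y, z): minimize z  s.t.  f(xi) + gi^T (y - xi) <= z,  |y_j| <= box
        c = np.concatenate([np.zeros(n), [1.0]])
        A = np.array([np.concatenate([gi, [-1.0]]) for (_, gi, _) in cuts])
        b = np.array([gi @ xi - fi for (fi, gi, xi) in cuts])
        bounds = [(-box, box)] * n + [(None, None)]
        res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
        y, z = res.x[:n], res.x[n]
        predicted_decrease = fx - z          # gap between iterate and model minimum
        if predicted_decrease <= tol:        # model certifies near-optimality
            return x, fx
        fy, gy = f_and_subgrad(y)
        if fx - fy >= m * predicted_decrease:
            x, fx = y, fy                    # "serious" step: accept the trial point
        cuts.append((fy, gy, y.copy()))      # serious or not, keep the new cut
    return x, fx

x_best, f_best = cutting_plane_serious_steps(np.array([3.0, -2.0, 1.5]))
print(x_best, f_best)
```

Each rejected ("non-serious") trial point still adds a cut, so the polyhedral model of f improves even when the iterate does not move; this mirrors the serious/null-step mechanism the abstract describes, while the paper itself obtains the trial point through a feasible search direction and a step-length rule.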

Cited by 10 publications (6 citation statements)
References: 26 publications

“…Takezawa et al. [18] and Brittain et al. [5] solved problem (10) and problem (13), respectively, numerically and reported that the obtained optimal solutions have simple maximum eigenvalues. In contrast, Herskovits et al. [11] found an optimal solution with a fivefold maximum eigenvalue by using their algorithm for nonsmooth convex optimization. This section shows that, for truss structures, a series of simple problem instances can be constructed so that the multiplicity of the optimal solution increases as the problem size increases.…”
Section: On Multiplicity of Eigenvalues (mentioning)
confidence: 97%
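The link between eigenvalue multiplicity and nonsmoothness noted in this snippet follows from a standard fact about symmetric matrices, stated here in generic form (it is not taken from the cited papers' problem formulations). For a symmetric matrix \(A\),
\[
\lambda_{\max}(A) = \max_{\|v\|=1} v^{\mathsf T} A v,
\qquad
\partial\lambda_{\max}(A) = \bigl\{\, V S V^{\mathsf T} : S \succeq 0,\ \operatorname{tr} S = 1 \,\bigr\},
\]
where the columns of \(V\) form an orthonormal basis of the eigenspace of \(\lambda_{\max}(A)\). If the maximum eigenvalue is simple, the subdifferential reduces to the single matrix \(v_1 v_1^{\mathsf T}\) and \(\lambda_{\max}\) is differentiable; if its multiplicity exceeds one (such as the fivefold eigenvalue reported above), the subdifferential is a whole set and the function is nonsmooth, which is why a nonsmooth convex method is needed at such optima.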
“…These are the main ideas from the NFDA that we need in order to understand the IED algorithm. For more on the NFDA, see [9,10].…”
Section: NFDA for Convex Problems (mentioning)
confidence: 99%
“…The IED method [1] combines the Nonsmooth Feasible Directions Algorithm (NFDA) for solving unconstrained nonsmooth convex problems, first presented in [9] and further studied in [10], with a certain DSG method.…”
Section: Introduction (mentioning)
confidence: 99%
“…For constrained optimization problems, a hybrid intelligent algorithm is developed: the new algorithm combines and extends the Genetic Algorithm [12][13][14][15][16][17][18][19][20][21][22][23] and the Zoutendijk Algorithm [24][25][26][27]. Its characteristic is as follows: both deterministic and random search are taken into account, and multi-point and single-point searching are carried out simultaneously.…”
Section: Introduction (mentioning)
confidence: 99%
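The combination described in the snippet above, random multi-point (GA-style) search together with a deterministic single-point improvement step, can be sketched as follows. This is a toy illustration and not the cited authors' algorithm: the deterministic step is a simple projected-gradient move standing in for a Zoutendijk-type feasible-direction step, and the objective, feasible set, and all parameters are invented for the example.

```python
# Toy hybrid (assumed details): GA-style random multi-point search plus a
# deterministic single-point refinement of the current best individual.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Convex toy objective to minimize: squared distance to the point (1,...,1)."""
    return float((x - 1.0) @ (x - 1.0))

def project(x, lo=-2.0, hi=2.0):
    """Feasible set: a simple box; projection keeps every candidate feasible."""
    return np.clip(x, lo, hi)

def deterministic_step(x, step=0.2):
    grad = 2.0 * (x - 1.0)                   # exact gradient of the toy objective
    return project(x - step * grad)          # deterministic, single-point move

def hybrid_search(dim=5, pop_size=20, generations=50, sigma=0.5):
    pop = rng.uniform(-2.0, 2.0, size=(pop_size, dim))    # random multi-point start
    for _ in range(generations):
        scores = np.array([objective(x) for x in pop])
        order = np.argsort(scores)
        parents = pop[order[: pop_size // 2]]              # deterministic selection
        children = parents + rng.normal(0.0, sigma, parents.shape)  # random mutation
        children = np.array([project(c) for c in children])
        best_refined = deterministic_step(pop[order[0]])   # single-point refinement
        pop = np.vstack([parents, children])
        pop[-1] = best_refined                             # keep the refined point
    scores = np.array([objective(x) for x in pop])
    return pop[np.argmin(scores)]

print(hybrid_search())
```

The population supplies the randomness and the multi-point exploration, while the refinement of the current best individual supplies the deterministic, single-point search the snippet refers to.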
“…The essential step of the dynamical optimization model is to design a new algorithm. For constrained optimization problems, a hybrid intelligent algorithm is developed: the new algorithm combines and extends the Genetic Algorithm and the Zoutendijk Algorithm. Its characteristic is as follows: both deterministic and random search are taken into account.…”
Section: Introduction (mentioning)
confidence: 99%