A one-layer recurrent neural network for constrained nonconvex optimization
2015
DOI: 10.1016/j.neunet.2014.09.009

Cited by 83 publications (13 citation statements). References: 38 publications.

Citation statements:
“…where γ > 0 is a penalty parameter, and sgn(•) is the sign function. Then, inspired by [14,22], we construct the following feedback neural network for minimizing E(x): ẋ = −∇E(x)…”
Section: Problem Statement and Model Description (mentioning; confidence: 99%)
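The excerpt above describes a gradient-flow ("feedback") network ẋ = −∇E(x) driven by a penalized energy. The sketch below is a minimal illustration of that idea, assuming an ℓ1-style exact penalty E(x) = f(x) + γ·max(0, g(x)), consistent with the mention of sgn(•), whose role here is played by the kink of max(0, ·). The objective f, the constraint g, γ = 10, and the Euler step size are hypothetical choices, not values from the cited papers.

```python
import numpy as np

# Minimal sketch of the cited gradient-flow ("feedback") network
#     x' = -grad E(x),   E(x) = f(x) + gamma * max(0, g(x)),
# integrated with forward Euler. f, g, gamma, and the step size are
# illustrative assumptions, not taken from the cited paper.

def f(x):                       # hypothetical smooth objective
    return 0.5 * np.sum(x**2) - x[0]

def grad_f(x):
    return x - np.array([1.0, 0.0])

def g(x):                       # hypothetical inequality constraint g(x) <= 0
    return x[0] + x[1] - 0.5

def grad_g(x):
    return np.array([1.0, 1.0])

def grad_E(x, gamma=10.0):
    # Subgradient of the exact penalty: active only when g(x) > 0;
    # the sign function sgn(.) enters through the kink of max(0, .).
    pen = grad_g(x) if g(x) > 0 else np.zeros_like(x)
    return grad_f(x) + gamma * pen

x = np.array([2.0, 2.0])        # arbitrary initial state
for _ in range(2000):           # Euler step: x <- x + h * x'
    x -= 1e-3 * grad_E(x)
print(x, g(x))                  # x should settle near the constrained minimum
```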
“…Such a penalty γ can strengthen the constraints and drive the results into the feasible region more efficiently in optimization problems with inequality constraints. In addition, as we will show later, the penalty parameter γ makes the neural network compatible with the network model in [22] for solving some nonconvex problems.…”
Section: Problem Statement and Model Description (mentioning; confidence: 99%)
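A standard result from exact-penalty theory (a textbook fact, not a claim taken from the cited paper) explains why a sufficiently large γ "strengthens the constraints": once γ dominates the optimal Lagrange multipliers, every minimizer of the penalized energy is automatically feasible.

```latex
% Textbook exact-penalty condition (not from the cited paper): for
%   E(x) = f(x) + \gamma \sum_i \max\{0, g_i(x)\}
% and the problem \min f(x) subject to g_i(x) \le 0 with optimal
% multipliers \lambda_i^*,
\[
  \gamma > \max_i \lambda_i^*
  \;\Longrightarrow\;
  \operatorname*{arg\,min}_x E(x)
  \subseteq \{\, x : g_i(x) \le 0 \ \forall i \,\}.
\]
```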
“…The sparse constraint makes GSNMF a nonconvex nonsmooth problem that traditional optimization algorithms cannot solve directly. Recently, neural networks have emerged as a powerful tool for optimization problems [14][15][16][17][18][19][20][21][22][23][24][25][26][27]. For some nonconvex problems, an inertial projection neural network (IPNN) [16] has been proposed that uses the inertial term to search for different local optimal solutions.…”
Section: Introduction (mentioning; confidence: 99%)
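To make the role of the inertial term concrete, here is a minimal sketch of inertial (heavy-ball) projected dynamics in the spirit of the IPNN mentioned above; the tilted double-well objective, the box constraint, the step size, and the inertia coefficient β are all illustrative assumptions, not details from [16].

```python
import numpy as np

# Hedged sketch of inertial projected dynamics: heavy-ball momentum
# plus projection onto a feasible set. All choices below are
# illustrative, not taken from the cited IPNN paper.

def grad_f(x):                        # f(x) = x^4 - 2x^2 + 0.3x (two minima)
    return 4 * x**3 - 4 * x + 0.3

def project(x, lo=-2.0, hi=2.0):      # projection onto the box [-2, 2]
    return np.clip(x, lo, hi)

def inertial_run(x0, beta, h=0.01, steps=4000):
    x_prev = x = np.array([x0])
    for _ in range(steps):
        # the inertial term beta*(x - x_prev) can carry the trajectory
        # over shallow barriers, so different beta may end up in
        # different local minima of the nonconvex objective
        x_prev, x = x, project(x - h * grad_f(x) + beta * (x - x_prev))
    return x

print(inertial_run(2.0, beta=0.0))    # no inertia: nearest local minimum
print(inertial_run(2.0, beta=0.9))    # strong inertia: may settle in another basin
```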
“…Based on the SFLA framework, the global optimal solution can be searched for. Moreover, many optimization methods for nonconvex nonsmooth problems use neural networks [22][23][24][25][26][27].…”
Section: Introduction (mentioning; confidence: 99%)
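For readers unfamiliar with the SFLA framework mentioned in this excerpt, the following is a simplified sketch of shuffled frog leaping: rank the population, deal frogs into memeplexes, and let each memeplex's worst frog leap toward its best. The Rastrigin objective, the population sizes, and the random-reset fallback (in place of the canonical leap toward the global best frog) are illustrative simplifications, not the algorithm of the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                                   # hypothetical multimodal objective
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)   # Rastrigin

def sfla_sketch(dim=2, frogs=30, memeplexes=3, gens=100, moves=5):
    pop = rng.uniform(-5.12, 5.12, (frogs, dim))
    for _ in range(gens):
        pop = pop[np.argsort([f(x) for x in pop])]      # shuffle: re-rank all frogs
        for m in range(memeplexes):
            idx = np.arange(m, frogs, memeplexes)       # deal frogs by rank
            for _ in range(moves):
                scores = [f(pop[i]) for i in idx]
                best = idx[int(np.argmin(scores))]
                worst = idx[int(np.argmax(scores))]
                # worst frog leaps a random fraction toward the memeplex best
                trial = pop[worst] + rng.uniform() * (pop[best] - pop[worst])
                if f(trial) < f(pop[worst]):            # accept an improving leap
                    pop[worst] = trial
                else:                                   # simplified fallback: reset
                    pop[worst] = rng.uniform(-5.12, 5.12, dim)
    return pop[int(np.argmin([f(x) for x in pop]))]

print(sfla_sketch())   # should approach the global minimum near the origin
```

The partition-by-rank dealing and the periodic shuffle follow the usual SFLA outline; the random reset merely keeps the sketch short.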