2020
DOI: 10.1007/s00500-020-05080-7
Global-best optimization of ANN trained by PSO using the non-extensive cross-entropy with Gaussian gain

Cited by 5 publications (1 citation statement)
References 32 publications
“…Using the DNN ensemble instead of a single DNN also minimizes, to some extent, the problem of convergence to local minima due to gradient descent optimization [3,4]. Alternative solutions to usage of ensembles for inducing variety in learning include the global optimization of network weights using evolutionary algorithms such as Particle Swarm Optimization [5].…”
Section: Introduction
Confidence: 99%
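The citation statement above refers to globally optimizing a neural network's weights with Particle Swarm Optimization instead of gradient descent. A minimal sketch of that general idea follows — it is not the cited paper's method (which uses a non-extensive cross-entropy with Gaussian gain); the network size, MSE loss, XOR task, and all PSO hyperparameters (swarm size, inertia `w`, coefficients `c1`/`c2`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)  # XOR targets

# A 2-2-1 sigmoid network has 4 + 2 + 2 + 1 = 9 trainable parameters.
DIM = 9

def loss(theta):
    """Mean squared error of the 2-2-1 sigmoid network encoded in theta."""
    W1, b1 = theta[:4].reshape(2, 2), theta[4:6]
    W2, b2 = theta[6:8].reshape(2, 1), theta[8]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2))).ravel()
    return np.mean((out - y) ** 2)

def pso(n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Global-best PSO over the flattened weight vector."""
    pos = rng.uniform(-2, 2, (n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                               # per-particle best position
    pbest_f = np.array([loss(p) for p in pos])       # per-particle best loss
    gbest = pbest[pbest_f.argmin()].copy()           # swarm-wide (global) best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, DIM))
        # Velocity update: inertia + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([loss(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best_weights, best_mse = pso()
print(f"best MSE on XOR: {best_mse:.4f}")
```

Because the swarm samples many weight vectors at once and shares a global best, it can escape some local minima that trap a single gradient-descent run — the property the citing authors contrast with DNN ensembles.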