2018
DOI: 10.1007/978-981-13-1592-3_41

Salp Swarm Algorithm (SSA) for Training Feed-Forward Neural Networks

Cited by 39 publications (13 citation statements)
References 14 publications
“…In the last experiment, we compare our algorithm with 11 population-based trainers, namely particle swarm optimisation [11], artificial bee colony (ABC) [14], imperialist competitive algorithm (ICA) [15], firefly algorithm (FA) [18], grey wolf optimiser (GWO) [19], ant lion optimiser [21], dragonfly algorithm (DA) [22], sine cosine algorithm [23], whale optimisation algorithm (WOA) [24], grasshopper optimisation algorithm [25], and salp swarm algorithm (SSA) [56]. Algorithms such as PSO and ABC are among established training algorithms, while some others such as GOA and WOA are more recent.…”
Section: Results (mentioning)
confidence: 99%
“…In 2019, Bairathi et al. used the SSA algorithm with an MLP network to obtain an optimal set of weights and biases. The efficiency was evaluated on standard datasets and by comparison with some recent metaheuristic algorithms [45]. In 2018, Heidari et al. proposed training an MLP network with the Grasshopper Optimization Algorithm.…”
Section: A Background and Related Work (mentioning)
confidence: 99%
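The excerpt above summarises the paper's core idea: each salp's position vector encodes the full set of MLP weights and biases, and SSA searches that space to minimise the network's error. Below is a minimal Python sketch of that scheme, not the authors' code: the one-hidden-layer sigmoid MLP, the MSE fitness, the search bounds, and the helper names (mlp_forward, mse_fitness, ssa_train) are all illustrative assumptions, while the leader/follower updates follow the standard SSA formulation of Mirjalili et al.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, w1, b1, w2, b2):
    # One-hidden-layer MLP with sigmoid activations (an assumed topology).
    h = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def mse_fitness(pos, X, y, n_in, n_hid):
    # Decode one flat "salp position" into weights/biases, return MSE.
    i = n_in * n_hid
    w1 = pos[:i].reshape(n_in, n_hid)
    b1 = pos[i:i + n_hid]
    w2 = pos[i + n_hid:i + 2 * n_hid].reshape(n_hid, 1)
    b2 = pos[-1:]
    return float(np.mean((mlp_forward(X, w1, b1, w2, b2).ravel() - y) ** 2))

def ssa_train(X, y, n_in, n_hid, n_salps=30, iters=200, lb=-10.0, ub=10.0):
    dim = n_in * n_hid + 2 * n_hid + 1             # every weight and bias
    pop = rng.uniform(lb, ub, (n_salps, dim))
    fit = np.array([mse_fitness(p, X, y, n_in, n_hid) for p in pop])
    best = fit.argmin()
    food, food_fit = pop[best].copy(), fit[best]   # best solution so far
    for l in range(1, iters + 1):
        c1 = 2.0 * np.exp(-(4.0 * l / iters) ** 2)  # SSA's decaying step
        for i in range(n_salps):
            if i < n_salps // 2:                   # leaders orbit the food
                c2 = rng.random(dim)
                c3 = rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pop[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                  # followers chain behind
                pop[i] = (pop[i] + pop[i - 1]) / 2.0
        np.clip(pop, lb, ub, out=pop)
        fit = np.array([mse_fitness(p, X, y, n_in, n_hid) for p in pop])
        best = fit.argmin()
        if fit[best] < food_fit:
            food, food_fit = pop[best].copy(), fit[best]
    return food, food_fit
```

Encoding the whole network as one flat vector is what makes the trainer pluggable: only mse_fitness ties the search to the MLP, so any of the population-based optimisers listed in the first excerpt could be swapped in for SSA.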
“…A disadvantage of deterministic approaches is that adding extra hidden layers slows down the training of the network and that, depending on the initial solution, the search can become stuck in a local optimum [33][34][35]. Another technique used in training the network is stochastic algorithms [36]. Stochastic algorithms use randomness, which reduces the probability of getting stuck in a local minimum and makes them less dependent on the initial solution [37]. There are multiple types of stochastic algorithms, one of which is nature-inspired metaheuristic algorithms [38].…”
Section: Introduction (mentioning)
confidence: 99%
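To make the excerpt's contrast concrete, here is a toy Python comparison, purely an illustration on an assumed multimodal objective: deterministic gradient descent ends in whichever basin its single starting point selects, while a stochastic random-restart variant samples many starting points and is far less dependent on any one initial solution.

```python
import numpy as np

rng = np.random.default_rng(1)

f = lambda x: np.sin(3 * x) + 0.1 * x ** 2       # many local minima
df = lambda x: 3 * np.cos(3 * x) + 0.2 * x       # its exact derivative

def gradient_descent(x, lr=0.01, steps=2000):
    # Deterministic: the outcome is fixed entirely by the start point.
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_det = gradient_descent(4.0)                    # lands in a nearby basin

# Stochastic: random restarts sample many basins and keep the best,
# reducing dependence on any single initial solution.
starts = rng.uniform(-5.0, 5.0, 20)
x_sto = min((gradient_descent(s) for s in starts), key=f)

print(f"deterministic start 4.0 -> f({x_det:.3f}) = {f(x_det):.3f}")
print(f"20 random restarts      -> f({x_sto:.3f}) = {f(x_sto):.3f}")
```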