2020
DOI: 10.4018/ijsir.2020070102

Dead Sea Water Levels Analysis Using Artificial Neural Networks and Firefly Algorithm

Abstract: In this study, the performance of an adaptive multilayer perceptron neural network (MLPNN) for predicting the Dead Sea water level is discussed. The Firefly Algorithm (FFA) is used as the optimization algorithm for training the neural network. To build the MLPNN-FFA model, Dead Sea water levels over the period 1810–2005 are used to train the MLPNN. Statistical tests evaluate the accuracy of the hybrid MLPNN-FFA model. The predicted values of the proposed model were compared with the results obtained by another meth…
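To make the hybrid approach concrete, the following is a minimal Python sketch of the general idea the abstract describes: a firefly algorithm searching the weight space of a small single-hidden-layer perceptron. The function names, network size, and FFA parameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a plain firefly algorithm optimizing the flat weight
# vector of a one-hidden-layer MLP with a sigmoid hidden activation.
import numpy as np

rng = np.random.default_rng(0)

def mlp_predict(weights, x, n_hidden=10):
    """Forward pass of a single-hidden-layer MLP (sigmoid hidden activation)."""
    n_in = x.shape[1]
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w2 = weights[n_in * n_hidden + n_hidden : -1].reshape(n_hidden, 1)
    b2 = weights[-1]
    h = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))  # sigmoid hidden layer
    return (h @ w2 + b2).ravel()

def mse(weights, x, y):
    return np.mean((mlp_predict(weights, x) - y) ** 2)

def firefly_train(x, y, n_fireflies=20, n_iter=200,
                  alpha=0.2, beta0=1.0, gamma=1.0):
    """Firefly search over MLP weights: lower MSE = brighter firefly."""
    dim = x.shape[1] * 10 + 10 + 10 + 1          # weights + biases for 10 hidden neurons
    pop = rng.normal(0.0, 1.0, size=(n_fireflies, dim))
    cost = np.array([mse(w, x, y) for w in pop])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:            # j is brighter, so i moves toward j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) + alpha * rng.normal(size=dim)
                    cost[i] = mse(pop[i], x, y)
    return pop[np.argmin(cost)]
```

In the paper's setting, x would hold lagged water-level values and y the level to be predicted; those inputs are not specified in the excerpt shown here.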

Cited by 7 publications (5 citation statements)
References 22 publications
“…In this study, we used several types of accuracy criteria: error functions. Other statistical criteria, such as the Akaike information criterion (AIC, AICc) and the Bayesian information criterion (BIC), can also be used [31][32][33]. The MAPE criterion is one of the error functions we have used, where the function is given as…”
Section: Accuracy Criteria (mentioning)
confidence: 99%
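The MAPE expression itself is truncated in the excerpt above, so the following is only the standard textbook definition, sketched in Python for reference; the example numbers are placeholders.

```python
# Standard mean absolute percentage error (MAPE), in percent.
import numpy as np

def mape(observed, predicted):
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((observed - predicted) / observed))

# Illustrative values only (e.g. water levels in metres below sea level).
print(mape([-400.1, -400.5, -401.0], [-400.0, -400.7, -400.8]))
```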
“…MLPNN is a feed-forward ANN with three types of layers (an input layer, hidden layers, and an output layer), as shown in Figure 2 [39][40][41][42]. In this study, we used a single hidden layer containing ten hidden neurons, and the hidden activation function is the sigmoid function, as determined by:…”
Section: ANN (mentioning)
confidence: 99%
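The architecture described here (one hidden layer of ten sigmoid neurons) can be written down compactly. The sketch below uses scikit-learn's MLPRegressor purely for illustration; the citing paper does not say which implementation it used, and the input features are invented placeholders.

```python
# A single-hidden-layer MLP with ten neurons and a logistic (sigmoid) activation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                          # placeholder inputs, e.g. lagged levels
y = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(scale=0.05, size=100)

model = MLPRegressor(hidden_layer_sizes=(10,),         # one hidden layer, 10 neurons
                     activation="logistic",            # sigmoid hidden activation
                     max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict(X[:5]))
```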
“…A multilayer perceptron neural network (MLPNN) is a feed-forward neural network with three types of layers (an input layer, hidden layers, and an output layer), as shown in Fig. 1 [25,26]. In this study, we have used one hidden layer with ten hidden neurons, and the hidden activation function is the sigmoid function, defined in the following equation.…”
Section: Structure (mentioning)
confidence: 99%
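The equation referenced in this excerpt is truncated; the sigmoid meant here is presumably the standard logistic function, whose usual definition for a hidden neuron j is:

```latex
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
h_j = \sigma\!\Big(\sum_{i} w_{ij} x_i + b_j\Big)
```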