2017
DOI: 10.1007/s11269-017-1807-2

A Comparative Assessment of Artificial Neural Network, Generalized Regression Neural Network, Least-Square Support Vector Regression, and K-Nearest Neighbor Regression for Monthly Streamflow Forecasting in Linear and Nonlinear Conditions

Cited by 101 publications (35 citation statements)
References 23 publications
“…The second term of Equation (6) is the penalty function on the difference between the actual output values and the model output. Regarding the parameter C, small values of this parameter in Equation (6) yield a simpler model, whereas large values yield a more complex one (Modaresi et al, 2018).…”
Section: Least-Squares Support Vector Machine (LSSVM)
confidence: 99%
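The role of C described above can be made concrete with a minimal LS-SVM regression sketch: the dual problem reduces to one linear system in which C appears as an I/C term on the kernel diagonal, so a large C weakens the regularization (a more complex fit) and a small C strengthens it. This is an illustrative implementation assuming an RBF kernel; the function names are the author's own, not from the cited paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=1.0):
    # LS-SVM regression: solve [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y].
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C   # larger C -> smaller I/C -> weaker regularization
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, gamma=1.0):
    # f(x) = sum_i alpha_i k(x, x_i) + b
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b
```

With a constant target the unique solution is alpha = 0 and b equal to that constant, which is a quick sanity check that the linear system is set up correctly.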
“…Another method used to make the most of the available data is the “cross-validation” technique, in which a distinct test set is used to evaluate model performance at different learning phases (Modaresi, Araghinejad, & Ebrahimi, ). In the cross-validation method, the norm is to split the available data into three sets: training, testing, and validation (Kisi, Mansouri, & Hu, ).…”
Section: Model Development Processes
confidence: 99%
“…The spread in this model takes values > 0 and should be determined by the user through trial and error; a typical value is 1.0. The number of neurons in the RBF layer equals the number of input data points; the summation layer has two neurons, the D-summation neuron and the S-summation neuron (Modaresi et al 2017). More details and applications of the GRNN model are presented in Specht (1991), Yin et al (2016), Tayyab et al (2016) and Modaresi et al (2017).…”
Section: General Regression Neural Network (GRNN)
confidence: 99%
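The architecture described above can be sketched directly: one RBF neuron per training sample, an S-summation neuron accumulating the target-weighted kernel activations, and a D-summation neuron accumulating the activations themselves; the prediction is their ratio. A minimal illustrative sketch assuming Euclidean distances and the standard Specht (1991) formulation:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_new, spread=1.0):
    # Pattern (RBF) layer: one neuron per training sample.
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * spread ** 2))
    # Summation layer: S-summation (target-weighted sum) and D-summation (plain sum).
    s = w @ y_train      # S-summation neuron
    d = w.sum(axis=1)    # D-summation neuron
    return s / d         # output layer: ratio of the two summation neurons
```

Because the output is a convex combination of the training targets, every prediction lies between the minimum and maximum observed target, and a very small spread reduces the model to nearest-neighbour lookup — which is why the spread must be tuned by trial and error as the quoted statement notes.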
“…In the past two decades, ANN models have been widely employed to predict hydrological variables such as daily, weekly and monthly runoff (Rajurkar et al 2004, Siou et al 2012, Nigam et al 2014, Nanda et al 2016). Different types of ANN models have been applied for rainfall–runoff simulation, such as the multi-layer perceptron (MLP) (Jain et al 2004, Riad et al 2004, Srinivasulu and Jain 2006, Rezaeian Zadeh et al 2010, Dhamge et al 2012, Rezaeianzadeh et al 2013, Kumar et al 2015), multi-layer back-propagation ANN (BPANN) (Agarwal and Singh 2004), back-propagation neural network (BPNN), radial basis function neural network (RBF) (Jayawardena and Fernando 1998, Lin and Chen 2004, Senthil Kumar et al 2005, Lee et al 2010, Dar 2017), feedforward back-propagation (FFBP) (Shiau and Hsu 2016), general regression neural network (GRNN) (Islam et al 2001, Cigizoglu and Alp 2004, Aytek and Alp 2008, Gowda and Mayya 2014, Mishra et al 2014, Tayyab et al 2016, Modaresi et al 2017) and rotated general regression neural network (RGRNN) (Irfan et al 2016, Yin et al 2016). In recent years, dynamic ANN models, which are more efficient than static ANN models for modelling time series, have been suggested (Guzman et al 2017).…”
Section: Introduction
confidence: 99%