2020
DOI: 10.1007/s00477-020-01874-1

Support vector regression optimized by meta-heuristic algorithms for daily streamflow prediction

Cited by 97 publications (27 citation statements)
References 106 publications
“…Accordingly, the original SVM algorithm is extended for nonlinear classification via the use of kernel functions [81]. Kernel functions, which are often nonlinear, map the space of input variables into a much higher-dimensional space, where separable hyperplanes are more likely to exist [82,83]. In addition, the soft margin is proposed to handle the case in which a separating hyperplane does not exist even in the higher-dimensional space [53].…”
Section: Support Vector Machine (SVM) for Pattern Classification (mentioning)
confidence: 99%
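The kernel trick and soft margin described in the excerpt above can be illustrated with a minimal sketch; this is not the cited paper's code, and the dataset, kernel choice, and penalty value C are assumptions made purely for illustration:

```python
# Minimal sketch (illustrative only): a soft-margin SVM with an RBF kernel,
# showing how a nonlinear kernel maps inputs into a higher-dimensional space
# where a separating hyperplane is more likely to exist.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split

# Concentric circles are not linearly separable in the original input space.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel='rbf' applies the kernel trick; C is the soft-margin penalty that
# tolerates some misclassified points when no clean hyperplane exists.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```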
“…It derived its basic concept from the behavior of bird flocks. PSO can be applied to many classes of optimization problems, such as multi-objective, nonlinear, and stochastic problems (Malik et al., 2020d; Tikhamarine et al., 2019, 2020). The working procedure of PSO can be summarized in the following steps:…”
Section: Particle Swarm Optimization (mentioning)
confidence: 99%
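The PSO steps referenced in the excerpt can be sketched generically as follows; the inertia weight, acceleration coefficients, and test function are illustrative assumptions, not the settings used in the cited study:

```python
# A compact, generic PSO sketch (illustrative only). Each particle tracks its
# personal best, the swarm tracks a global best, and velocities are updated
# from both, mimicking the flocking behavior described above.
import numpy as np

def pso(objective, dim=2, n_particles=30, n_iter=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    vel = np.zeros_like(pos)                             # particle velocities
    pbest = pos.copy()                                   # personal best positions
    pbest_val = np.apply_along_axis(objective, 1, pos)   # personal best values
    gbest = pbest[np.argmin(pbest_val)]                  # global best position

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.apply_along_axis(objective, 1, pos)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

# Example: minimize the sphere function; the optimum is at the origin.
best_x, best_f = pso(lambda x: np.sum(x ** 2))
print(best_x, best_f)
```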
“…The SVR has better generalization capability than many other ML algorithms (Malik et al., 2020d; Panahi et al., 2020). It is also highly robust to outliers (Borji et al., 2016; Qasem et al., 2019).…”
Section: Introduction (mentioning)
confidence: 99%
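As a rough illustration of SVR applied to one-step-ahead prediction from lagged values, the sketch below uses a synthetic daily series; the lag count, kernel, and hyperparameters are assumptions and do not reproduce the cited study's model setup or tuning:

```python
# Minimal SVR sketch for one-step-ahead prediction from lagged values
# (synthetic data; lag choice and hyperparameters are illustrative assumptions).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
t = np.arange(1000)
# Synthetic daily series with a seasonal cycle plus noise.
flow = 10 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)

lags = 3  # predict today's value from the previous three days (assumption)
X = np.column_stack([flow[i:-(lags - i)] for i in range(lags)])
y = flow[lags:]
split = int(0.8 * len(y))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:split], y[:split])
print("test R^2:", model.score(X[split:], y[split:]))
```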
“…The main reason that the ANN may not be able to capture extreme values is the scarcity of extreme values in the training data; hence, the ANN cannot adequately learn the process with respect to extremes (Adnan et al., 2019a). The other main reason may be that the range of extreme values in the training data is narrower than that of the validation and/or test data (Adnan et al., 2019a; Malik et al., 2020). This leads to extrapolation difficulties in machine learning models (Kisi and Aytek, 2013; Kisi and Parmar, 2016).…”
Section: Comparative Performance of Machine Learning Models (mentioning)
confidence: 99%
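The extrapolation issue described above amounts to test-period values falling outside the training range; a quick check of the kind below makes that concrete (the arrays are hypothetical, not data from the cited studies):

```python
# Quick range check illustrating the extrapolation issue: test values outside
# the training range force the model to extrapolate (values are hypothetical).
import numpy as np

train_flow = np.array([2.1, 5.4, 8.9, 14.2, 30.5])  # hypothetical training streamflow
test_flow = np.array([3.3, 12.7, 45.8])             # hypothetical test streamflow with a larger peak

out_of_range = (test_flow < train_flow.min()) | (test_flow > train_flow.max())
print("test values requiring extrapolation:", test_flow[out_of_range])
```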
“…The test results showed that the LSSVM- and MARS-based models provided more accurate predictions than the OP-ELM and M5Tree models. Malik et al. (2020) optimized SVM with six meta-heuristic algorithms to predict daily streamflow in the Naula watershed, India. They reported that the SVM optimized by the Harris Hawks optimization algorithm outperformed the other optimized SVM models.…”
Section: Introduction (mentioning)
confidence: 99%