2009
DOI: 10.1016/j.eswa.2008.09.053

Generalization performance of support vector machines and neural networks in runoff modeling

Cited by 229 publications (91 citation statements)
References 18 publications
“…For example, Behzad et al (2009) reported that SVMs are able to generalize better than ANNs, though there is still some danger of under- or overfitting to the training data (Han et al, 2007) (true to some extent of virtually any model). The SVMs are also able to learn from a much smaller training set than ANNs, and the global minimum of the linear optimization is easily obtainable, whereas there is a risk of becoming trapped in a local minimum of the non-linear ANN objective function (Behzad et al, 2009). …”
Section: Approaches To Hydrological Modelling (mentioning)
confidence: 99%
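The convexity argument in the statement above can be illustrated numerically: gradient descent on a convex objective (a stand-in for the SVM training problem) reaches the same global minimum from any starting point, while a non-convex objective (a stand-in for a typical ANN loss) traps it in different local minima depending on initialization. The one-dimensional objectives below are illustrative assumptions, not functions from the cited work:

```python
def gradient_descent(grad, x0, lr=0.02, steps=500):
    """Plain gradient descent on a 1-D objective, given its gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def convex_grad(x):
    # Gradient of the convex surrogate f(x) = (x - 2)^2: one global minimum at x = 2.
    return 2.0 * (x - 2.0)

def nonconvex_grad(x):
    # Gradient of the non-convex surrogate f(x) = x^4 - 3x^2 + x:
    # two separate basins, with local minima near x = -1.30 and x = 1.13.
    return 4.0 * x**3 - 6.0 * x + 1.0

# Convex case: both starting points converge to the same minimum (x = 2).
print(gradient_descent(convex_grad, -5.0), gradient_descent(convex_grad, 5.0))

# Non-convex case: the answer depends on where you start.
print(gradient_descent(nonconvex_grad, -2.0), gradient_descent(nonconvex_grad, 2.0))
```

The convex run is insensitive to initialization, which is the property the citing authors attribute to SVM training; the non-convex run returns two different minima, mirroring the local-minimum risk they attribute to ANN training.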
See 1 more Smart Citation
“…For example, Behzad et al (2009) reported that SVMs are able to generalize better than ANNs, though there is still some danger of under-or overfitting to the training data (Han et al, 2007) (true to some extent of virtually any model). The SVMs are also able to learn from a much smaller training set than ANNs, and the global minimum of the linear optimization is easily obtainable, whereas there is a risk of becoming trapped in a local minimum of the non-linear ANN objective function (Behzad et al, 2009). …”
Section: Approaches To Hydrological Modellingmentioning
confidence: 99%
“…Han et al (2007) compared the RBF and linear kernel functions and found that even within the same catchment, the ideal function can change under different circumstances. Despite being a relatively new approach to hydrological modelling, SVMs and SVRs have been applied to many of the same problems as ANNs, including rainfall-runoff modelling for water resources planning and flood forecasting at various lead-times (Asefa et al, 2006; Han et al, 2007; Behzad et al, 2009; Rasouli et al, 2012), hydraulic modelling (Liong and Sivapragasam, 2002) and downscaling of GCM output (Tripathi et al, 2006). Genetic Programming (GP; Koza, 1992) is another soft computing approach to non-linear modelling and is based on Darwin's theory of evolution by natural selection.…”
Section: Approaches To Hydrological Modelling (mentioning)
confidence: 99%
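Han et al's observation that the best kernel is situation-dependent can be reproduced in miniature. The sketch below, assuming scikit-learn is available, cross-validates a linear versus an RBF kernel for support vector regression on synthetic data (not data from any cited catchment); the nonlinear response favours RBF here, but a different data set could reverse the ranking:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical stand-in for rainfall-runoff data: two inputs and a
# noisy nonlinear response (illustrative only, not from the cited studies).
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0.0, 0.1, 200)

results = {}
for kernel in ("linear", "rbf"):
    scores = cross_val_score(SVR(kernel=kernel, C=10.0), X, y, cv=5, scoring="r2")
    results[kernel] = scores.mean()
    print(f"{kernel}: mean CV R^2 = {results[kernel]:.3f}")
```

Because the response contains a full sine period, the linear kernel can capture only part of the variance, while the RBF kernel tracks the nonlinearity; on a nearly linear catchment response the comparison could go the other way, which is exactly the circumstance-dependence the citing authors describe.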
“…The basic principle of learning in SVM is that it searches for an optimal hyperplane that satisfies the classification requirement, then uses an algorithm to maximize the margin of separation around the optimal hyperplane while ensuring the accuracy of classification (Yeh et al 2010). It produces a binary classifier via a so-called optimal separating hyperplane, yields a unique global optimum and high generalization performance, and does not suffer from the local-optima problem (Behzad et al, 2009). The principle of SVM can be described as follows.…”
Section: Support Vector Machines (mentioning)
confidence: 99%
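The formulation the statement refers to is the standard maximum-margin problem. For linearly separable training points $\mathbf{x}_i$ with labels $y_i \in \{-1, +1\}$, it is the convex quadratic program

```latex
\min_{\mathbf{w},\,b}\ \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}
\quad\text{subject to}\quad
y_i\,(\mathbf{w}^{\top}\mathbf{x}_i + b) \ge 1,\qquad i = 1,\dots,n,
```

whose separating hyperplane $\mathbf{w}^{\top}\mathbf{x} + b = 0$ has margin $2/\lVert\mathbf{w}\rVert$. Because the objective is convex and the constraints are linear, any optimum is global, which is the global-optimum property the statement cites.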
“…Since the success of SVM depends on the choice of kernel function K and hyperparameters, a cross-validation procedure should be performed to adjust those parameters (Min and Lee, 2005; Behzad et al, 2009). Linear, polynomial, RBF, and exponential kernels were used in our experiments, where the gamma coefficient for the polynomial and RBF kernels was 0.0625, the degree was 3, the coefficient varied from 0 to 0.1, and C = 10.…”
Section: Support Vector Machines (mentioning)
confidence: 99%
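A cross-validation search of the kind described can be sketched with scikit-learn's `GridSearchCV`, reusing the hyperparameter values quoted in the statement (gamma = 0.0625, degree = 3, coefficient 0 to 0.1, C = 10). The data below are synthetic, and scikit-learn has no separate "exponential" kernel, so only the linear, polynomial, and RBF kernels are searched:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
# Hypothetical binary classification data (not from any cited study):
# the positive class lies outside a circle in the first two features.
X = rng.normal(size=(300, 4))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 2.0).astype(int)

# Hyperparameter values quoted in the citation statement above.
param_grid = [
    {"kernel": ["linear"], "C": [10.0]},
    {"kernel": ["poly"], "degree": [3], "gamma": [0.0625],
     "coef0": [0.0, 0.05, 0.1], "C": [10.0]},
    {"kernel": ["rbf"], "gamma": [0.0625], "C": [10.0]},
]

search = GridSearchCV(SVC(), param_grid, cv=5).fit(X, y)
print(search.best_params_["kernel"], round(search.best_score_, 3))
```

On this nonlinear boundary the linear kernel cannot do much better than the majority class, so the search settles on one of the nonlinear kernels; swapping in real data would let the same procedure pick whichever kernel generalizes best there.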
“…Nevertheless, variations will appear in the accuracy of a pattern recognition algorithm when the training sample data are changed [11,12]. Therefore, the stability of the classification algorithms for different data sample sets has become an obstacle for the widespread application of PD UHF on-line monitoring for power transformers.…”
Section: Introduction (mentioning)
confidence: 99%
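The stability issue these authors raise, accuracy varying with the training sample, can be quantified directly by retraining a model on resampled training sets and measuring the spread of its test accuracy. A minimal sketch, assuming scikit-learn is available; the data, models, and sample sizes are illustrative assumptions, not from the cited work:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Hypothetical feature vectors standing in for PD pattern data.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_train, y_train, X_test, y_test = X[:300], y[:300], X[300:], y[300:]

def accuracy_spread(make_model, n_resamples=20, n_train=100):
    """Std. dev. of test accuracy over models fit on bootstrap training samples."""
    accs = []
    for _ in range(n_resamples):
        idx = rng.integers(0, len(X_train), size=n_train)
        model = make_model().fit(X_train[idx], y_train[idx])
        accs.append(model.score(X_test, y_test))
    return float(np.std(accs))

print("linear SVM spread:", accuracy_spread(lambda: SVC(kernel="linear", C=1.0)))
print("decision tree spread:", accuracy_spread(DecisionTreeClassifier))
```

A smaller spread means the classifier's accuracy is less sensitive to which training sample it happened to see, which is the stability property the statement says matters for deploying on-line PD monitoring.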