2015 1st International Conference on Next Generation Computing Technologies (NGCT)
DOI: 10.1109/ngct.2015.7375263
SVM and ANN: A comparative evaluation

Cited by 7 publications (5 citation statements) | References 9 publications
“…Second, ANNs often overfit if training goes on too long, so for a given pattern, an ANN might start to consider the noise as part of the pattern [56,57]. SVMs do not suffer from either of these problems [16].…”
Section: Discussion
confidence: 99%
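The overfitting contrast the citing authors describe can be sketched in a few lines of scikit-learn. The classifiers, the noisy synthetic dataset, and the hyperparameters below are illustrative assumptions, not taken from the cited paper; the sketch only shows how a long-trained MLP can memorise label noise while a soft-margin SVM's capacity is fixed by C and the kernel.

```python
# Illustrative sketch (not from the cited paper): an MLP trained for many
# iterations on noisy labels can start fitting the noise, while an SVM with
# a fixed soft margin is less prone to this failure mode.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Deliberately noisy labels (flip_y) so memorisation is visible.
X, y = make_classification(n_samples=500, n_features=20, flip_y=0.2,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Long ANN training without early stopping: train accuracy keeps climbing,
# test accuracy can stall or drop.
ann = MLPClassifier(hidden_layer_sizes=(100,), max_iter=2000,
                    early_stopping=False, random_state=0).fit(X_tr, y_tr)

# Soft-margin SVM: capacity is controlled by C and the kernel, not by epochs.
svm = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)

for name, clf in [("ANN", ann), ("SVM", svm)]:
    print(name, "train:", clf.score(X_tr, y_tr), "test:", clf.score(X_te, y_te))
```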
“…Among the various machine learning approaches, ANNs, Bayesian networks (BN) and support vector machines (SVMs) are the most frequently applied techniques [15,16]. These approaches are algorithms with significant importance for processes where the application of conventional models is complicated by the complexity and unknown conditions affecting the process or by changing environmental conditions.…”
Section: Data-driven Models for the Simulation of Hydrological Processes
confidence: 99%
“…Because data points in ETT with drift were linearly inseparable, ACCESS generated a training model by utilizing the labeled ETT (Fig. 4(c)) as training data through an optimized neural network (NN) algorithm [46] instead of SVM. The training model was applied to this ETT for reclassification.…”
Section: Identification of Transition States in ETTs Using ACCESS
confidence: 99%
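The choice described in this excerpt, falling back to a neural network when the points are linearly inseparable, can be illustrated with a small sketch. The dataset (make_circles) and the two models below are stand-ins under my own assumptions; they do not reproduce the ACCESS pipeline or its ETT data.

```python
# Illustrative only: why a linear SVM fails on linearly inseparable points
# while a small neural network can still fit them. Not the ACCESS pipeline.
from sklearn.datasets import make_circles
from sklearn.svm import LinearSVC
from sklearn.neural_network import MLPClassifier

# Concentric circles: no separating hyperplane exists in the input space.
X, y = make_circles(n_samples=400, noise=0.1, factor=0.4, random_state=0)

linear_svm = LinearSVC(max_iter=10000).fit(X, y)
nn = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X, y)

print("linear SVM accuracy:", linear_svm.score(X, y))   # close to chance
print("neural network accuracy:", nn.score(X, y))       # close to 1.0
```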
“…Whereas the distance-based weighting takes the inverse of the distances between the input sample and its k nearest neighbors into account, the uniform assigns the same weight for each neighbor as the name suggests. C-SVC is a type of support vector machine (SVM) that can employ, for example, linear, polynomial, radial basis function, and sigmoid kernels [32]. Fitting an SVC is based on structural risk minimization and aims to find a hyperplane that maximizes the distance between the hyperplane and the data points of different classes that are closest to it [33].…”
Section: Other Classifiers
confidence: 99%
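Both ideas in this excerpt, uniform versus distance weighting for k-NN and kernel choice for a C-SVC, map directly onto standard scikit-learn options. The dataset and parameter values in the sketch below are assumptions for illustration, not settings from reference [32].

```python
# Illustrative sketch: uniform vs. distance weighting in k-NN, and the
# kernels named in the excerpt for a C-SVC. Data and hyperparameters assumed.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

for weights in ("uniform", "distance"):
    knn = KNeighborsClassifier(n_neighbors=5, weights=weights)
    print(f"kNN ({weights}):", cross_val_score(knn, X, y, cv=5).mean())

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    svc = SVC(kernel=kernel, C=1.0)   # C-SVC: soft-margin formulation
    print(f"C-SVC ({kernel}):", cross_val_score(svc, X, y, cv=5).mean())
```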
“…Fitting an SVC is based on structural risk minimization and aims to find a hyperplane that maximizes the distance between the hyperplane and the data points of different classes that are closest to it [33]. A kernel is a function that is used to compute how similar two input samples are [32].…”
Section: Other Classifiers
confidence: 99%
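As a small worked example of a kernel acting as a similarity measure, the sketch below evaluates the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2) by hand and checks it against scikit-learn's implementation. The vectors and gamma value are arbitrary choices for the example.

```python
# Illustrative: the RBF kernel as a similarity function between two samples.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

x = np.array([[1.0, 2.0, 3.0]])
z = np.array([[1.5, 1.0, 2.5]])
gamma = 0.5

# k(x, z) = exp(-gamma * ||x - z||^2); 1.0 is the maximum (identical samples).
manual = np.exp(-gamma * np.sum((x - z) ** 2))
library = rbf_kernel(x, z, gamma=gamma)[0, 0]

print(manual, library)   # identical up to floating-point rounding
```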