Published: 2016
DOI: 10.1016/j.compstruc.2016.05.003
A minimum-of-maximum relative error support vector machine for simultaneous reverse prediction of concrete components

Cited by 32 publications (9 citation statements)
References 35 publications
“…The optimization of accuracy and MSE can be improved by integrating an evolutionary algorithm to ensure its implementation on other data sets. In Reference 27 , despite high precision in classification, the method is time‐consuming and suitable for a small set of data. Most PSO algorithms, in combination with techniques such as SVM, can be effective, provided that they are not caught in local optimizations.…”
Section: Analysis Of Results
confidence: 99%
“…This error could be reduced by optimization methods, yet those methods have received no attention. The lack of repetition and the unresolved uncertainty problem in 27 also show that, since the SVM decision maker is a regression, PSO is applied to improve the results; but when the fitness function is a PSO function, it acts as a regression classifier. Since cross-validation is performed as K-fold with repetition, the correct answer is not obtained in a short time for big data.…”
Section: Analysis Of Results
confidence: 99%
“…In this research, the relative error Re and Nash-Sutcliffe efficiency coefficient Ens are adopted to evaluate the fitting effects between the simulated results and the measured values [56,57]. Computation methods of the relative error Re and the efficiency coefficient Ens are, respectively, as shown in Formulas (11) and (12).…”
Section: Analysis Of Simulation Effects
confidence: 99%
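The two goodness-of-fit measures named in the citation above can be sketched in code. Formulas (11) and (12) are not reproduced in this report, so the standard textbook definitions of the relative error and the Nash-Sutcliffe efficiency are assumed here; the function names and sample values are illustrative.

```python
def relative_error(simulated, measured):
    """Relative error Re of one simulated value vs. the measured value."""
    return (simulated - measured) / measured

def nash_sutcliffe(simulated, measured):
    """Nash-Sutcliffe efficiency Ens: 1 minus the ratio of the sum of
    squared simulation errors to the variance of the measured series."""
    mean_obs = sum(measured) / len(measured)
    sse = sum((o - s) ** 2 for o, s in zip(measured, simulated))
    spread = sum((o - mean_obs) ** 2 for o in measured)
    return 1.0 - sse / spread

# Illustrative measured vs. simulated series.
obs = [2.0, 4.0, 6.0]
sim = [2.1, 3.9, 6.2]
print(relative_error(sim[0], obs[0]))   # per-point relative error
print(nash_sutcliffe(sim, obs))         # close to 1.0 for a good fit
```

An Ens value near 1 indicates the simulation tracks the measurements well; values at or below 0 mean the model predicts no better than the mean of the observations.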
“…The dataset comprised 720 experimental tests. The following seven algorithms were implemented for each material [49]: Multi-Linear Regression (MLR), K Nearest Neighbours (KNN), Regression Tree (RT), Random Forest (RF), Gradient Boosting (GB), Multi-Layer Perceptron (MLP) [50] and Support Vector Machine (SVM) [51,52]. 80% of the observations are randomly chosen to form the training dataset.…”
Section: Introduction
confidence: 99%
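The random 80/20 split described in the citation above can be sketched as follows. The dataset contents, seed, and function name are illustrative assumptions; only the dataset size (720 experimental tests) and the 80% training fraction come from the cited text.

```python
import random

def train_test_split(rows, train_fraction=0.8, seed=42):
    """Randomly assign train_fraction of the observations to training."""
    rng = random.Random(seed)            # fixed seed for reproducibility
    indices = list(range(len(rows)))
    rng.shuffle(indices)
    cut = int(len(rows) * train_fraction)
    train = [rows[i] for i in indices[:cut]]
    test = [rows[i] for i in indices[cut:]]
    return train, test

dataset = list(range(720))               # stand-in for 720 experimental tests
train, test = train_test_split(dataset)
print(len(train), len(test))             # 576 144
```

Each of the seven regression algorithms would then be fitted on the 576 training observations and evaluated on the 144 held-out ones.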