2014
DOI: 10.1016/j.ymssp.2013.12.013

Structure damage detection based on random forest recursive feature elimination

Cited by 76 publications (40 citation statements)
References 17 publications
“…It is built by bootstrap sampling technology and random splitting technology, and the final classification result is made by a majority vote of the trees [39,40]. Because of its excellent generalization performance, RF is also further used for feature selection [41,42].…”
Section: Feature Selection With Random Forest
confidence: 99%
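The quote above describes the two randomization mechanisms behind a random forest: bootstrap sampling of the training set for each tree, random feature subsets at each split, and a majority vote over the trees for the final label. A minimal sketch of that idea, using scikit-learn with an illustrative synthetic dataset (none of the data or parameters come from the cited paper):

```python
# Sketch of a random forest: bootstrap sampling + random splitting,
# with the final classification made by a majority vote of the trees.
# Dataset and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# bootstrap=True: each tree sees a bootstrap sample of the rows;
# max_features="sqrt": each split considers a random feature subset.
rf = RandomForestClassifier(n_estimators=50, bootstrap=True,
                            max_features="sqrt", random_state=0).fit(X, y)

# Reproduce the majority vote explicitly from the individual trees.
votes = np.stack([tree.predict(X) for tree in rf.estimators_])
majority = (votes.mean(axis=0) >= 0.5).astype(int)
```

Note that `RandomForestClassifier.predict` actually averages class probabilities (soft voting) rather than counting hard votes, so the explicit tally above is a didactic reconstruction of the "majority vote" phrasing, not the library's exact internal rule.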
“…Based on the RMSE values, one variable was removed, and a new RF model was created using the remaining variables. This process was recursively applied until only one variable remained as input [36]. During the process of elimination, 10-fold cross validation was implemented to optimize the variable selection and to ascertain the standard. RF is a collection of several decision trees where each tree is constructed independently with random samples (n) from the training data.…”
Section: Introduction
confidence: 99%
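The elimination loop described in this quote — score the current variable set with 10-fold cross-validated RMSE, drop the variable the forest ranks least important, refit, and recurse until one variable remains — can be sketched as follows. The dataset, tree counts, and variable names are assumptions for illustration, not values from the cited work:

```python
# Sketch of RMSE-driven recursive feature elimination with a random
# forest, scored by 10-fold cross validation at every subset size.
# Data and parameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=120, n_features=8, noise=0.1,
                       random_state=0)
remaining = list(range(X.shape[1]))   # indices of surviving variables
rmse_per_subset = {}

while remaining:
    rf = RandomForestRegressor(n_estimators=30, random_state=0)
    # 10-fold CV RMSE for the current variable subset.
    scores = cross_val_score(rf, X[:, remaining], y, cv=10,
                             scoring="neg_root_mean_squared_error")
    rmse_per_subset[tuple(remaining)] = -scores.mean()
    if len(remaining) == 1:
        break                          # recursion ends at one variable
    rf.fit(X[:, remaining], y)
    # Drop the variable the forest ranks least important.
    remaining.pop(int(np.argmin(rf.feature_importances_)))

# Pick the subset whose cross-validated RMSE was lowest.
best_subset = min(rmse_per_subset, key=rmse_per_subset.get)
```

Tracking the CV score at every subset size is what lets the procedure both rank the variables and choose how many to keep.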
“…To reduce the set of 27 candidate predictor variables, a “random forest recursive feature elimination” was applied. The algorithm starts from the complete set and then eliminates the least relevant variables one after the other based on a variable importance ranking produced by fitting random forests (Guyon, Weston, Barnhill, & Vapnik, ; Zhou, Zhou, Zhou, Yang, & Luo, ). A logistic regression model was then calibrated using the retained variables and the possible connecting segments from Section 3.3.2 that were classified based on field data.…”
Section: Methods
confidence: 99%
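This last quote describes a two-stage pipeline: random forest RFE prunes the 27 candidate predictors down to the most relevant ones, and a logistic regression is then calibrated on the retained variables. A hedged sketch using scikit-learn's `RFE` helper (which automates the one-at-a-time elimination loop); the synthetic data, the choice of 6 retained variables, and all names are illustrative assumptions:

```python
# Sketch: random forest recursive feature elimination followed by a
# logistic regression on the retained variables. Data, the number of
# features kept, and all parameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=27, n_informative=6,
                           random_state=0)

# Stage 1: starting from all 27 candidates, eliminate the least
# relevant variable one at a time (step=1) using the forest's
# importance ranking.
selector = RFE(RandomForestClassifier(n_estimators=50, random_state=0),
               n_features_to_select=6, step=1).fit(X, y)
X_retained = X[:, selector.support_]

# Stage 2: calibrate a logistic regression on the retained variables.
logit = LogisticRegression(max_iter=1000).fit(X_retained, y)
```

`RFE` works with any estimator exposing `feature_importances_` or `coef_`, which is why a random forest slots in directly as the ranking model.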