2010 IEEE 7th International Conference on E-Business Engineering
DOI: 10.1109/icebe.2010.99

Trees Weighting Random Forest Method for Classifying High-Dimensional Noisy Data

Abstract: Random forest is an excellent ensemble learning method, composed of multiple decision trees grown on random input samples and splitting nodes on random subsets of features. Owing to its good classification and generalization ability, random forest has achieved success in various domains. However, random forest generates many noisy trees when it learns from a high-dimensional data set with many noise features. These noisy trees degrade the classification accuracy, and can even lead to a wrong …
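The central idea sketched in the abstract, reducing the influence of noisy trees on the ensemble decision, can be illustrated with a small example. The snippet below is a hypothetical sketch, not the paper's algorithm (the abstract is truncated): it assumes each tree is weighted by its out-of-bag (OOB) accuracy and that predictions are combined by weighted majority voting.

```python
# A minimal sketch of tree-weighted voting, assuming OOB-accuracy weights.
# Illustration only; the paper's exact weighting scheme is not shown here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# High-dimensional data where most features are noise.
X, y = make_classification(n_samples=600, n_features=200, n_informative=10,
                           random_state=0)
rng = np.random.RandomState(0)
n, n_trees = len(X), 100
trees, weights = [], []

for _ in range(n_trees):
    boot = rng.randint(0, n, n)              # bootstrap sample for this tree
    oob = np.setdiff1d(np.arange(n), boot)   # samples the tree never saw
    tree = DecisionTreeClassifier(max_features="sqrt", random_state=rng)
    tree.fit(X[boot], y[boot])
    trees.append(tree)
    # Out-of-bag accuracy: trees grown mostly on noise score low here.
    weights.append((tree.predict(X[oob]) == y[oob]).mean())

def weighted_vote(X_new):
    """Combine per-tree decisions with OOB-accuracy weights."""
    scores = np.zeros((len(X_new), 2))       # binary problem in this example
    for tree, w in zip(trees, weights):
        for i, c in enumerate(tree.predict(X_new)):
            scores[i, int(c)] += w
    return scores.argmax(axis=1)

print("weighted-vote accuracy on training data:",
      (weighted_vote(X) == y).mean())
```

On data with many noise features, trees grown mostly on noise tend to have low OOB accuracy, so a scheme of this kind automatically shrinks their contribution to the final vote.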

Cited by 46 publications (34 citation statements); references 11 publications.
“…Moreover, numerous researchers studied weighted trees in the random forest again using the OOB as a measure of tree importance. 24,25,26 Ronao and Cho 25 and Winham et al 26 were able to increase the prediction accuracy of their desired classification problem; however, they did so in an ad-hoc nature. Namely, they were only attempting to answer a specific question.…”
Section: Weighted Averages
confidence: 99%
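A common way to write such an OOB-weighted combination (one plausible reading of the schemes cited above, not necessarily their exact formulation) is

$$\hat{y}(x) = \arg\max_{c} \sum_{t=1}^{T} w_t \,\mathbf{1}\{h_t(x) = c\}, \qquad w_t = \text{OOB accuracy of tree } h_t,$$

so trees with low out-of-bag accuracy contribute proportionally less to the final decision.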
“…Namely, they were only attempting to answer a specific question. On the other hand, El Habib Daho et al and Li et al developed the theory of their weighted schemes but were unable to determine specific factors where their weighted scheme worked best.…”
Section: Introduction
confidence: 99%
“…This method uses random feature selection in the tree induction process and finally aggregates the predictions of the ensemble to make the final decision. When a new object is to be classified from an input vector, it passes the sample vector to each of the trees in the forest so that each tree can provide a classification decision, and then results of individual trees are combined to choose the classification [30].…”
Section: State-of-the-art Prediction Strategies
confidence: 99%
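The aggregation described in this excerpt, passing the input vector to every tree and combining the individual decisions, is the standard random forest prediction step. A minimal sketch using scikit-learn's RandomForestClassifier follows; note that scikit-learn itself combines trees by averaging class probabilities rather than by hard majority vote, so this illustrates the quoted description rather than any specific cited implementation.

```python
# Sketch of per-tree voting as described in the excerpt above; scikit-learn's
# RandomForestClassifier is used only as a convenient stand-in.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

x_new = X[:1]                                    # one input vector to classify
votes = np.array([tree.predict(x_new)[0]         # each tree gives a decision
                  for tree in rf.estimators_]).astype(int)
majority = np.bincount(votes).argmax()           # combine individual results
print("hard majority vote:", rf.classes_[majority])
print("forest prediction: ", rf.predict(x_new)[0])   # usually agrees
```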
“…Some random forest algorithms assign weights to classes [13]. There are algorithms with weights of decision trees [37,42,54]. However, to the best of our knowledge, the weighting schemes have not been used in RSFs.…”
Section: Introduction
confidence: 99%