2011
DOI: 10.1007/978-3-642-23783-6_29

On Oblique Random Forests

Abstract: In his original paper on random forests, Breiman proposed two different decision tree ensembles: one generated from "orthogonal" trees with thresholds on individual features in every split, and one from "oblique" trees separating the feature space by randomly oriented hyperplanes. In spite of a rising interest in the random forest framework, however, ensembles built from orthogonal trees (RF) have gained most, if not all, attention so far. In the present work we propose to employ "oblique" random fore…


Cited by 156 publications (131 citation statements). References 35 publications.
“…RF typically refers to what Breiman called Forest-RI, an ensemble of axis-parallel, or orthogonal, decision trees [11,14]. In these types of trees, the feature space is recursively split along directions parallel to the axes of the feature space.…”
Section: Introduction
confidence: 99%
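The axis-parallel splitting rule described in this statement can be sketched as an exhaustive impurity search over single features. The following toy Python is illustrative only (function name and data are not taken from the cited implementations):

```python
import numpy as np

def best_axis_parallel_split(X, y):
    """Exhaustively search the best axis-parallel split (j, t):
    send samples with X[:, j] <= t left, the rest right, and pick
    the pair minimizing the weighted Gini impurity of the children."""
    def gini(labels):
        if labels.size == 0:
            return 0.0
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - np.sum(p ** 2)

    n = y.size
    best_j, best_t, best_score = None, None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:  # last value gives an empty right child
            mask = X[:, j] <= t
            score = (mask.sum() * gini(y[mask])
                     + (~mask).sum() * gini(y[~mask])) / n
            if score < best_score:
                best_j, best_t, best_score = j, t, score
    return best_j, best_t

# Toy data: the classes separate on feature 0 alone
X = np.array([[0.1, 3.0], [0.2, 1.0], [0.9, 2.0], [0.8, 0.5]])
y = np.array([0, 0, 1, 1])
j, t = best_axis_parallel_split(X, y)  # → (0, 0.2)
```

An oblique tree would instead search over linear combinations of features, which is exactly the limitation the variants quoted next try to address.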
“…Thus, in cases in which the classes seem inseparable along any single dimension, RF may be suboptimal. To address this drawback, several variations of "oblique" decision forests have been proposed, including efforts to learn good projections [11,16], using principal components analysis to find the directions of maximal variance [12,15], or directly learning good discriminant directions [14]. Another recently proposed method, called Random Rotation RF (RR-RF), uniformly randomly rotates the data for every decision tree in the ensemble prior to inducing the tree [2].…”
Section: Introduction
confidence: 99%
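The Random Rotation RF idea quoted above — uniformly rotating the data for every tree before induction — can be sketched by drawing a random orthogonal matrix per tree. A minimal numpy sketch under that reading (names are illustrative):

```python
import numpy as np

def random_rotation(d, rng):
    """Sample a random orthogonal matrix via QR decomposition of a
    Gaussian matrix; the sign fix makes the draw uniform (Haar).
    RR-RF conceptually applies one such matrix per tree."""
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    q *= np.sign(np.diag(r))  # normalize column signs
    return q

rng = np.random.default_rng(0)
Q = random_rotation(3, rng)
X = rng.standard_normal((5, 3))
X_rot = X @ Q  # rotated copy of the data fed to one tree of the ensemble
```

Because the transform is orthogonal, distances and angles are preserved; only the axes the orthogonal tree splits along change.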
“…While random forests are decision tree ensembles generated from orthogonal trees, oblique random forests are built from trees that split using linear discriminative models, such as LDA, ridge regression and logistic regression [68]. The success of these models confirms the importance of including non-linear models in the prediction framework.…”
Section: Network-expression-mutation Signature View
confidence: 69%
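An oblique split of the kind this statement describes learns a full linear discriminant at each node instead of a single-feature threshold. As a hedged sketch, plain gradient-descent logistic regression stands in below for the paper's actual LDA/ridge/logistic solvers:

```python
import numpy as np

def oblique_split_direction(X, y, steps=2000, lr=0.5):
    """Fit a linear discriminant direction (w, b) by gradient-descent
    logistic regression; an oblique tree then routes samples on the
    sign of X @ w + b rather than on one feature. This solver is a
    stand-in sketch, not the cited implementations."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(y = 1)
        g = p - y                                # log-loss gradient
        w -= lr * (X.T @ g) / n
        b -= lr * g.mean()
    return w, b

# Classes separated only by an oblique boundary (roughly x1 + x2 = 1),
# inseparable along either single axis
X = np.array([[0.0, 0.0], [0.2, 0.3], [1.0, 1.0], [0.8, 0.9]])
y = np.array([0, 0, 1, 1])
w, b = oblique_split_direction(X, y)
left = X @ w + b <= 0  # samples routed to the left child
```

On data like this, no axis-parallel threshold on a single feature cleanly separates the classes, which is the motivating case for oblique trees.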
“…In the context of a high dimensional response (output) space, [15] is yet another interesting adaptation of random forest aimed at attaining ever greater predictive performance. [1] and [17] have also recently proposed very interesting extensions on the RF theme that seek to further lower the prediction error. Coming from a perspective similar to ours, even though their base learners are still trees whereas we allow any type of base learner, [1] enriches the RF ensemble by way of a random subspace selection that gives less weight to weak features, i.e.…”
Section: Problem Formulation
confidence: 99%
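The "random subspace selection that gives less weight to weak features" mentioned in the last statement could, for illustration, be realized as importance-weighted feature sampling. This is purely a hypothetical sketch of that idea, not the cited method:

```python
import numpy as np

def weighted_feature_subset(importances, k, rng):
    """Draw k distinct feature indices with probability proportional
    to an importance weight, so weak features are sampled less often
    when building each tree. Names and weights are made up."""
    p = np.asarray(importances, dtype=float)
    p = p / p.sum()
    return rng.choice(p.size, size=k, replace=False, p=p)

rng = np.random.default_rng(1)
idx = weighted_feature_subset([0.7, 0.2, 0.05, 0.05], 2, rng)
```

Standard RF samples the candidate features at each split uniformly; weighting the draw by importance biases every tree toward informative directions.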