IEEE International Geoscience and Remote Sensing Symposium, 2004. IGARSS '04. Proceedings
DOI: 10.1109/igarss.2004.1368591

Random forest classification of multisource remote sensing and geographic data

Abstract: The use of random forests for classification of multisource data is investigated in this paper. A random forest is a classifier that grows many classification trees. Each tree is trained on a bootstrapped sample of the training data, and at each node the algorithm searches across only a random subset of the variables to determine a split. To classify an input vector with a random forest, the vector is submitted as input to each of the trees in the forest, and the classification is then determined by a majority vo…
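
The workflow described in the abstract (one bootstrap sample per tree, a random subset of variables at each split, majority voting over the trees) can be sketched with scikit-learn. This is only an illustrative sketch, not the authors' implementation; the multisource feature stack and class labels below are synthetic placeholders.

```python
# Minimal sketch of the random-forest workflow from the abstract, assuming
# scikit-learn. X stacks per-pixel values from several hypothetical sources
# (spectral bands plus ancillary geographic layers); y holds class labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels, n_spectral, n_ancillary = 1000, 6, 2

# Hypothetical multisource stack: spectral features + ancillary layers.
X = np.hstack([rng.random((n_pixels, n_spectral)),   # e.g. multispectral bands
               rng.random((n_pixels, n_ancillary))]) # e.g. elevation, slope
y = rng.integers(0, 4, size=n_pixels)                # 4 placeholder land-cover classes

# Each tree is grown on a bootstrap sample; at every node only a random
# subset of the features (here sqrt of the total) is searched for the split.
rf = RandomForestClassifier(n_estimators=200,
                            max_features="sqrt",
                            bootstrap=True,
                            oob_score=True,
                            random_state=0)
rf.fit(X, y)

# A new pixel is pushed through every tree; the forest returns the
# majority vote over the individual tree predictions.
new_pixel = rng.random((1, n_spectral + n_ancillary))
print("predicted class:", rf.predict(new_pixel)[0], "OOB score:", rf.oob_score_)
```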

Cited by 89 publications (51 citation statements). References 4 publications.
“…Classifiers have the same performance for classification of the joint dataset; SVM and RF perform better than NN and MLC at the 5% significance level. Parametric classification algorithms such as MLC are not typically suitable for multi-source data [73,84]. The better performance of SVM and RF compared to NN may be because both of these classifiers can handle high-dimensional data [77,84].…”
Section: Discussion
confidence: 99%
“…Despite limitations due to its assumption of normally distributed class signatures [69], it is perhaps one of the most widely used classifiers [70][71][72]. Non-parametric approaches are suggested for the classification of multi-source data in complex environments [73]. SVM is a supervised non-parametric statistical learning technique [74] that follows the principle of structural risk minimization.…”
Section: Determination of the Land Cover Classification Scheme
confidence: 99%
“…As illustrated previously, Ntree and Ntry are two sensitive parameters in RF models. Ntry is the square root of the total number of variables [41]. Ntree influences the convergence of the RF and can be determined through the OOB error.…”
Section: Model Implementation and Validation
confidence: 99%
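
The parameter choices quoted above (Ntry set to the square root of the number of variables, Ntree chosen by monitoring the out-of-bag error) can be mimicked with scikit-learn's RandomForestClassifier. The data below are synthetic placeholders; the loop only illustrates how the OOB error can be tracked as trees are added.

```python
# Sketch of the Ntree / Ntry choices mentioned above, assuming scikit-learn:
# max_features="sqrt" sets Ntry to sqrt(number of variables), and Ntree
# (n_estimators) is chosen by watching the out-of-bag (OOB) error level off.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.random((500, 16))            # 16 placeholder input variables
y = rng.integers(0, 3, size=500)     # 3 placeholder classes

for n_trees in (25, 50, 100, 200, 400):
    rf = RandomForestClassifier(n_estimators=n_trees,
                                max_features="sqrt",   # Ntry = sqrt(16) = 4
                                oob_score=True,
                                random_state=1).fit(X, y)
    print(f"Ntree={n_trees:4d}  OOB error={1.0 - rf.oob_score_:.3f}")
```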
“…The assignment of the class label of an unknown instance is performed using a majority-voting strategy. Owing to important advantages such as handling a very large number of input attributes at low computational cost, Random Forest has attracted wide interest in remote sensing image classification (Waske and Braun, 2009; Gislason et al., 2004; Qi et al., 2012; Samat et al., 2014).…”
Section: Training and Classification by Random Forest
confidence: 99%
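
The majority-voting step described in this statement can be made explicit by querying each fitted tree and taking the most frequent label. This is a minimal sketch assuming scikit-learn; it mirrors what RandomForestClassifier.predict already does internally, and the data are synthetic placeholders.

```python
# Explicit per-tree majority vote, assuming scikit-learn. Labels are the
# integers 0..2, so each tree's output coincides with the class label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
X = rng.random((300, 8))
y = rng.integers(0, 3, size=300)
rf = RandomForestClassifier(n_estimators=50, random_state=2).fit(X, y)

sample = rng.random((1, 8))
# Collect one vote per tree, then pick the most frequent class.
votes = np.array([tree.predict(sample)[0] for tree in rf.estimators_]).astype(int)
majority = np.bincount(votes).argmax()
print("first 10 per-tree votes:", votes[:10], "-> majority class:", majority)
```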