2005
DOI: 10.1080/01431160512331314083

Support vector machines for classification in remote sensing

Cited by 858 publications (423 citation statements). References 4 publications.
“…Figure 3 illustrates a simple scenario of a two-class separable classification problem in a two-dimensional input space [30]. SVMs represent a notable development in machine learning research [31], and are particularly appealing in the remote sensing field due to their ability to generalize well even with limited training samples, a common limitation in remote sensing applications [30]. They can achieve higher classification accuracy than Maximum Likelihood and Artificial Neural Network classifiers [29], with an overall accuracy above 90% [32].…”
Section: LULC Maps Production and LULCC Detection (mentioning)
confidence: 99%
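The two-class separable case described in this statement can be reproduced in a few lines. The sketch below is illustrative only and assumes scikit-learn and synthetic 2D data; it is not the setup used in the cited papers.

```python
# Minimal sketch of a two-class, linearly separable problem in a
# two-dimensional input space. scikit-learn and the toy data are assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two well-separated clusters, one per class.
class_a = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))
class_b = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
X = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

# A linear kernel suffices for the separable case; the separating
# hyperplane is defined entirely by the support vectors.
clf = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors per class:", clf.n_support_)
```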
“…A report by Huang et al [2002] concluded that the accuracy improvement was insignificant if γ exceeded 7.5. Using a Radial Basis Function (RBF) kernel combined with C = 5000, Pal and Mather [2005] suggested γ = 2 to obtain the best accuracy. In another case study, Foody and Mathur [2004] showed that the optimal γ was in the range 0.005 to 0.08.…”
Section: Introduction (mentioning)
confidence: 99%
“…In feature extraction, a decision plane is a plane that separates sets of objects having different class memberships. Points closest to the margin are called support vectors and are used in the training phase of classification (Huang & Townshend, 2002; Mountrakis & Ogole, 2011; Pal & Mather, 2005).…”
Section: Figure 1. General Flowchart of LiDAR Data Classification (mentioning)
confidence: 99%
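The support vectors mentioned in this statement can be inspected directly after training. The sketch below is a minimal illustration assuming scikit-learn and synthetic data, not the LiDAR workflow of the citing paper.

```python
# Sketch: fit an SVM and inspect the support vectors that define the
# decision plane. scikit-learn and the toy data are assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.5, 1.0, size=(40, 2)),
               rng.normal(1.5, 1.0, size=(40, 2))])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# Only the samples on or inside the margin are retained as support vectors.
print("support vector count:", len(clf.support_vectors_))
print("indices of support vectors in the training set:", clf.support_)
```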