2004
DOI: 10.1109/tgrs.2004.827262
Robust support vector method for hyperspectral data classification and knowledge discovery

Cited by 241 publications
(104 citation statements)
References 35 publications
“…Support vector machines (SVMs) are nonparametric discriminative classifiers which typically perform well in classification of hyperspectral data and do not require reduction in dimensionality [35]. A radial basis function (RBF) kernel-based SVM [34] was used as the base learner.…”
Section: B. Experimental Design
confidence: 99%
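The quoted passage describes using an RBF-kernel SVM as a base learner for high-dimensional hyperspectral data without prior dimensionality reduction. A minimal sketch of that setup, using scikit-learn's `SVC` on synthetic stand-in data (the band count and labels below are illustrative assumptions, not the cited papers' data):

```python
# Hypothetical sketch: an RBF-kernel SVM as the base learner for
# hyperspectral pixel classification. The synthetic matrix X stands
# in for pixels with many spectral bands.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_bands = 50                                   # many bands, no reduction applied
X = rng.normal(size=(200, n_bands))
y = (X[:, :5].sum(axis=1) > 0).astype(int)     # toy two-class labels

# The RBF kernel lets the SVM operate directly in the high-dimensional
# input space, as the quoted passage notes.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)
acc = clf.score(X, y)
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is only that no dimensionality-reduction step precedes the classifier; real experiments would of course evaluate on held-out pixels.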
“…Further, in order to concentrate on the most informative samples within this pool, we explore sample consistency by the clustering assumption [20], [21]. Querying those samples could help refine the classification hyperplane, especially for discriminative classifiers such as SVM [35], whose performance heavily relies on the quality of the support vectors around the decision boundary. Also, by avoiding queries of samples from high-density regions where labels are more likely to be consistent, we can avoid inclusion of non-informative/redundant samples in the training pool, thereby reducing the overall sample size required to train a good learner.…”
confidence: 99%
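The active-learning idea in the quote above, querying samples near the SVM decision boundary, can be sketched with simple margin (uncertainty) sampling. This is a hedged illustration of the general technique, not the cited paper's algorithm; the pool sizes and data are invented:

```python
# Hedged sketch of margin-based query selection: rank unlabeled samples
# by |decision value| and query those nearest the hyperplane, which are
# the most informative for refining the classifier.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_lab = np.vstack([rng.normal(-2, 1, size=(20, 2)),
                   rng.normal(+2, 1, size=(20, 2))])
y_lab = np.array([0] * 20 + [1] * 20)
X_pool = rng.normal(0, 2, size=(100, 2))       # unlabeled pool

clf = SVC(kernel="rbf", gamma="scale").fit(X_lab, y_lab)

# Smallest |f(x)| = closest to the decision boundary = most uncertain.
scores = np.abs(clf.decision_function(X_pool))
query_idx = np.argsort(scores)[:5]             # query the 5 most uncertain
print("queried indices:", query_idx)
```

The quote's density-avoidance refinement would add a clustering step to skip queries in high-density regions; that part is omitted here for brevity.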
“…SVM has demonstrated its robustness to outliers and is an excellent classifier when the number of input features is high [12]. The original binary version of SVM aims to find the optimal plane that separates the available data into two classes by maximizing the distance (margins) between the so-called support vectors (i.e., the closest training samples to the optimal hyperplane).…”
Section: Vegetation Index (VI) Formula
confidence: 99%
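The quoted description of the binary SVM, an optimal separating hyperplane whose margin is defined by the support vectors closest to it, can be illustrated on toy 2-D data. A minimal sketch (again using scikit-learn, not the original paper's implementation):

```python
# Minimal sketch: a linear SVM on two Gaussian blobs. Only a small
# subset of training samples become support vectors, and they alone
# determine the maximum-margin hyperplane.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 1, size=(50, 2)),
               rng.normal(+2, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

n_sv = clf.support_vectors_.shape[0]
print(f"support vectors: {n_sv} of {len(X)} training samples")
```

Because the hyperplane depends only on these boundary samples, SVMs are comparatively robust to outliers far from the margin, which is the robustness property the quote attributes to them.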
“…Remote sensing image classification is a convenient approach for producing these maps due to advantages in terms of cost, revisit time, and spatial coverage [10]. Indeed, remotely sensed image classification has been successfully applied to produce crop maps in homogeneous areas [11][12][13][14].…”
Section: Introduction
confidence: 99%
“…With the successful application of SVMs to a wide variety of different pattern recognition areas, such as 3D object recognition, image classification, character recognition, etc., SVMs have recently attracted increasing attention in the remote-sensing multi-/hyperspectral communities [7,8,9,10,11,12,13,14]. Previous literature applying SVMs to hyperspectral data classification [10,11,12] has shown competitive performance with the best available classification algorithms.…”
Section: Introduction
confidence: 99%