1998
DOI: 10.1109/5326.661089

Supervised classification in high-dimensional space: geometrical, statistical, and asymptotical properties of multivariate data

Abstract: As the number of spectral bands of high spectral resolution data increases, the capability to detect more detailed classes should also increase, and the classification accuracy should increase as well. Often the number of labeled samples used for supervised classification techniques is limited, thus limiting the precision with which class characteristics can be estimated. As the number of spectral bands becomes large, the limitation on performance imposed by the limited number of training samples can become severe…

Cited by 371 publications (201 citation statements)
References 26 publications (23 reference statements)
“…Efficient algorithms are needed in order to extract the robust spatial and spatiotemporal features in the data identifying regions in the brain that best discriminate the classes. Jimenez and Landgrebe (1998) demonstrated that high dimensional space is mostly empty and pointed out that the useful information can be extracted more easily in a lower-dimensional subspace using Projection Pursuit (PP) algorithms (Jimenez and Landgrebe, 1999). PP is a proposed solution for problems where the classification is not accurate due to the limited number of training samples in a high dimensional space.…”
Section: Introduction
confidence: 99%
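The "high-dimensional space is mostly empty" observation quoted above can be illustrated with a small numerical experiment (a hypothetical sketch, not code from either cited paper): the fraction of the hypercube [-1, 1]^D occupied by the inscribed unit ball collapses toward zero as D grows, so uniformly scattered samples are almost never near the center of the space.

```python
import math
import random

def fraction_in_unit_ball(dim, n_samples=20000, seed=0):
    """Monte Carlo estimate of the fraction of the hypercube [-1, 1]^dim
    that lies inside the inscribed unit ball. The estimate shrinks
    rapidly with dim, illustrating how empty high-dimensional space is."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        # Squared Euclidean norm of a uniform random point in the cube.
        sq_norm = sum(rng.uniform(-1.0, 1.0) ** 2 for _ in range(dim))
        if sq_norm <= 1.0:
            inside += 1
    return inside / n_samples

for dim in (2, 5, 10, 20):
    print(dim, fraction_in_unit_ball(dim))
```

In 2 dimensions the estimate is near pi/4 ≈ 0.785; by 10 dimensions it is already below one percent, and by 20 dimensions essentially no sampled point falls in the ball.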
“…The estimation of density functions may not be overly reliable for these cases since the dimensionality of the sample space and the number of samples per class are comparable. It has been reported that the number of samples in each class must be at least ten times the dimensionality of the sample space for a reliable density estimation [30], [31]. Thus, our proposed kernel subspace classifier may be a good choice in such cases.…”
Section: G. Discussion
confidence: 99%
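The ten-samples-per-dimension rule of thumb quoted above reduces to a one-line check. The helper below is a hypothetical sketch for illustration, not code from either cited work; the function name and the example band count are assumptions.

```python
def enough_samples_for_density_estimation(n_samples_per_class, dimensionality, factor=10):
    """Rule of thumb from the excerpt: each class needs at least `factor`
    times the dimensionality in labeled samples for a reliable density
    estimate in the full sample space."""
    return n_samples_per_class >= factor * dimensionality

# For a hypothetical 200-band hyperspectral problem, the rule asks for
# at least 2000 labeled samples per class.
print(enough_samples_for_density_estimation(500, 200))   # too few samples
print(enough_samples_for_density_estimation(2500, 200))  # enough samples
```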
“…The problem of dimensionality reduction appears in many fields of artificial intelligence such as data mining, data compression and data visualization, moderating the curse of dimensionality and other undesired properties of high dimensional spaces [14]. Given a dataset X = [x_1, x_2, ..., x_n] ∈ R^{n×D} consisting of n data vectors x_i with dimensionality D and intrinsic dimension d (with d << D).…”
Section: Dimensionality Reduction
confidence: 99%
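The setup in that excerpt, n vectors in R^D with intrinsic dimension d << D, can be sketched numerically under a simple linear-subspace assumption. PCA via the SVD stands in here for the general dimensionality-reduction step; all names and sizes are illustrative, not from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)
n, D, d = 200, 50, 3  # illustrative sizes: n samples, ambient dim D, intrinsic dim d

# Data with intrinsic dimension d embedded in R^D:
# latent factors mixed into the ambient space by a random linear map.
latent = rng.normal(size=(n, d))
mixing = rng.normal(size=(d, D))
X = latent @ mixing  # shape (n, D), rank at most d

# PCA via SVD of the centered data; with exact rank d, the top-d
# principal components capture essentially all of the variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)

X_reduced = Xc @ Vt[:d].T  # the recovered (n, d) low-dimensional representation
print("variance captured by top", d, "components:", round(explained[:d].sum(), 6))
print("reduced shape:", X_reduced.shape)
```

Because the synthetic data lie exactly on a d-dimensional subspace, the top d singular directions recover it; real data only approximately satisfy this, which is why the intrinsic dimension d is a modeling assumption.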