1999
DOI: 10.1117/12.373270
<title>Integration of spatial and spectral information in unsupervised classification for multispectral and hyperspectral data</title>

Cited by 6 publications (2 citation statements)
References 0 publications
“…This type of processing can be approached from various points of view. In [30], a possibility is discussed for the refinement of results obtained by spectral techniques through a second step based on spatial context. The techniques proposed by these authors do not focus on endmember extraction, but yield some insightful feedback on plausible spatial and spectral response combinations.…”
Section: Background On Endmember Extraction Techniques
Confidence: 99%
“…In recent years, non-parametric machine learning classifiers were also developed to improve classification accuracy, such as the classification and regression tree (CART) [14][15][16], support vector machine (SVM) [17][18][19], and random forest (RF) [20,21]. These classifiers are often used for hyperspectral data with more spectral information provided in each pixel [22,23]. However, the increase in within-class variance and decrease in between-class variance often limit the accuracy of pixel-based methods [24].…”
Section: Introduction
Confidence: 99%