2002
DOI: 10.1117/12.454150
<title>Curvilinear component analysis for nonlinear dimensionality reduction of hyperspectral images</title>

Cited by 31 publications (25 citation statements)
References 0 publications
“…For example, curvilinear component analysis [26], curvilinear distance analysis [27], and manifold learning [28]–[31] are nonlinear projections based on the preservation of the local topology. Independent component analysis [32], [33], projection pursuit [34], [35], and wavelet decomposition [36], [37] have also been considered.…”
Section: A. Unsupervised Hyperspectral Dimensionality Reduction Methods
confidence: 99%
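The citing passage describes curvilinear component analysis as a nonlinear projection that preserves local topology. A minimal sketch of the classic stochastic update rule (Demartines and Hérault's formulation: output points are nudged so that small output distances match the corresponding input distances, weighted by a shrinking neighbourhood radius) might look as follows. All parameter names and schedules here are illustrative, not taken from the cited paper.

```python
import numpy as np

def cca(X, n_components=2, n_epochs=50, alpha0=0.5, lam0=None, seed=0):
    """Sketch of curvilinear component analysis (CCA).

    Preserves local topology: whenever the output distance D_ij falls
    inside the neighbourhood radius lambda, point y_j is moved so that
    D_ij approaches the input-space distance d_ij. Both the learning
    rate and lambda decay over the epochs (illustrative schedules).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # input-space pairwise distances d_ij
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    if lam0 is None:
        lam0 = d.max()
    Y = rng.standard_normal((n, n_components))
    for epoch in range(n_epochs):
        t = epoch / max(n_epochs - 1, 1)
        alpha = alpha0 * (1 - t) + 1e-3        # decaying learning rate
        lam = lam0 * (1 - t) + 1e-3 * lam0     # shrinking neighbourhood
        for i in rng.permutation(n):
            diff = Y - Y[i]                     # y_j - y_i, shape (n, k)
            D = np.linalg.norm(diff, axis=1)    # output distances D_ij
            D[i] = np.inf                       # skip the point itself
            mask = D < lam                      # step-function weighting
            # move neighbours so that D_ij approaches d_ij
            Y[mask] += alpha * ((d[i, mask] - D[mask]) / D[mask])[:, None] * diff[mask]
    return Y
```

Because the update magnitude is alpha·|d_ij − D_ij|, the rule is stable without gradient clipping; only pairs currently inside the neighbourhood are touched, which is what "preservation of the local topology" refers to.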
“…A variety of local methods exist for estimating manifolds. For example, curvilinear component analysis [100], curvilinear distance analysis [101], and manifold learning [102]–[107] are nonlinear projections based on the preservation of the local topology. Independent component analysis [108], [109], projection pursuit [110], [111], and wavelet decomposition [112], [113] have also been considered.…”
Section: Signal Subspace Identification
confidence: 99%
“…Even though its theoretical limitations for hyperspectral data analysis have been pointed out (Landgrebe 2003; Lennon 2002), in practical situations the results obtained using PCA are still competitive for the purpose of classification (Journaux et al. 2006; Lennon et al. 2001). The advantages of PCA are its low complexity and the absence of parameters.…”
Section: Feature Extraction and Selection
confidence: 99%
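The passage above credits PCA with low complexity and the absence of tuning parameters. A minimal numpy sketch of PCA applied to a hyperspectral cube (height × width × bands) illustrates both points: the only choice is the number of retained components, and the cost is dominated by one B × B eigendecomposition. The function name and shapes are illustrative, not from the cited works.

```python
import numpy as np

def pca_reduce(cube, k=3):
    """Reduce the spectral dimension of a hyperspectral cube (H, W, B)
    to k principal components using plain numpy. Beyond k itself there
    is nothing to tune, which is what the cited comparisons mean by
    PCA being parameter-free."""
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)
    X -= X.mean(axis=0)                  # centre each spectral band
    # eigendecomposition of the B x B band covariance matrix
    cov = X.T @ X / (X.shape[0] - 1)
    w, V = np.linalg.eigh(cov)           # eigenvalues in ascending order
    V = V[:, ::-1][:, :k]                # keep the top-k components
    return (X @ V).reshape(H, W, k)
```

The projected bands come out ordered by explained variance, so the first output plane carries the most signal; for classification experiments like those cited, the k planes replace the original B bands.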