2022
DOI: 10.1109/tci.2022.3230584

Stochastic Higher-Order Independent Component Analysis for Hyperspectral Dimensionality Reduction

Abstract: Hyperspectral imaging is a remote sensing technique that measures the spectrum of each pixel in the image of a scene. It can be used to detect objects or classify materials based on their optical reflectance spectra. Various methods have been developed to reduce the spectral dimension of hyperspectral images in order to facilitate their analysis. Independent Component Analysis (ICA) is a class of algorithms that extract statistically independent features. FastICA is one of the most widely used ICA algorithms because…
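As context for the ICA-based band reduction the abstract refers to, below is a minimal sketch using scikit-learn's standard FastICA. It is not the paper's stochastic higher-order ICA algorithm; the cube shape, the component count, and the reduce_bands helper are illustrative assumptions only.

```python
# Minimal sketch: a standard FastICA baseline for hyperspectral band reduction.
# NOT the paper's stochastic higher-order ICA; it only illustrates the general
# ICA-based dimensionality-reduction workflow described in the abstract.
import numpy as np
from sklearn.decomposition import FastICA

def reduce_bands(cube: np.ndarray, n_components: int = 10) -> np.ndarray:
    """Reduce a hyperspectral cube (H, W, B) to (H, W, n_components) with FastICA."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)          # each row is one pixel's spectrum
    ica = FastICA(n_components=n_components, random_state=0)
    features = ica.fit_transform(pixels)  # statistically independent components
    return features.reshape(h, w, n_components)

if __name__ == "__main__":
    # Hypothetical synthetic cube standing in for a real hyperspectral image.
    cube = np.random.rand(64, 64, 128).astype(np.float32)
    reduced = reduce_bands(cube, n_components=10)
    print(reduced.shape)  # (64, 64, 10)
```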

Cited by 17 publications (17 citation statements)
References 34 publications (72 reference statements)
“…α_k), associated with the matrix K, can be determined using equation (9). Subsequently, these eigenvalues λ_K (λ_1, λ_2, λ_3, …”
Section: Gaussian-KW-NPE (mentioning)
confidence: 99%
“…This heightened interest can be attributed to its lucid statistical interpretation, straightforward model training, and superior efficacy in addressing issues associated with small sample sizes. Established MSPM fault detection methods, including partial least squares (PLS), principal component analysis (PCA), and independent component analysis (ICA), have achieved widespread adoption [6][7][8][9]. However, with the proliferation of variables in industrial monitoring, data is progressively transitioning into high dimensions.…”
Section: Introduction (mentioning)
confidence: 99%
“…Unsupervised feature extraction methods represent the original data without any prior knowledge. Typical unsupervised feature extraction methods include principal component analysis (PCA) [9], independent component analysis (ICA) [10] and discrete wavelet transform (DWT) [11]. Supervised feature extraction methods use labelled samples to infer the separability of categories; the most representative method is linear discriminant analysis (LDA) [12].…”
Section: Introduction (mentioning)
confidence: 99%
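The statement above distinguishes unsupervised feature extraction (no labels) from supervised feature extraction (labels guide separability). Below is a minimal sketch of that distinction, contrasting PCA with LDA; the synthetic data, class count, and component counts are hypothetical and do not come from the cited papers.

```python
# Minimal sketch of the unsupervised vs. supervised distinction:
# PCA uses no labels, LDA uses class labels to maximize class separability.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))      # 300 samples, 50 spectral features (synthetic)
y = rng.integers(0, 3, size=300)    # 3 hypothetical material classes

X_pca = PCA(n_components=5).fit_transform(X)                             # unsupervised
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)   # supervised

print(X_pca.shape, X_lda.shape)     # (300, 5) (300, 2)
```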
“…However, the problems of low spatial resolution, high spectral dimensionality and a lack of labelled samples in HSI pose great challenges to the classification task [10][11][12]. Early on, researchers proposed a series of feature extraction methods such as principal component analysis [13,14], independent component analysis [15][16][17], and linear discriminant analysis [17,18], and combined them with machine learning classifiers such as support vector machines [19,20], random forests [21,22], and Gaussian mixture models [23,24] to classify the HSI. These methods can effectively alleviate the Hughes phenomenon [25], in which classification accuracy decreases with increasing spectral dimension, but because they consider only spectral features and rely on manual design, their classification accuracy and applicability are not ideal.…”
Section: Introduction (mentioning)
confidence: 99%