2016
DOI: 10.1080/07038992.2016.1160772
Minimum Noise Fraction versus Principal Component Analysis as a Preprocessing Step for Hyperspectral Imagery Denoising

Cited by 154 publications (66 citation statements)
References 27 publications
“…MNF is a linear transform consisting of two steps: (1) computation of the noise covariance matrix to decorrelate and rescale the noise in the data; and (2) application of a standard PCA transform to the noise-decorrelated and rescaled data. The goal of the MNF transform is to select components that maximize the signal-to-noise ratio (SNR), which compares the level of the signal to the level of the background noise rather than the information content [13]. Ordering the components in this way allows noisy components to be identified and eliminated more reliably, while preserving the components that contain useful information [9,13].…”
Section: Dimensionality Reduction Techniques
confidence: 99%
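The two-step MNF transform described in this excerpt can be sketched as noise whitening followed by a standard PCA. The explicit `noise` argument is an assumption for illustration; in practice the noise estimate is usually derived from the data itself (e.g. by differencing adjacent bands):

```python
import numpy as np

def mnf_transform(X, noise):
    """Minimum Noise Fraction sketch: (1) whiten the noise using its
    covariance; (2) apply PCA to the noise-whitened data, so components
    come out ordered by signal-to-noise ratio.
    X and noise are (n_pixels, n_bands) arrays."""
    # Step 1: noise covariance and the whitening transform derived from it
    sigma_n = np.cov(noise, rowvar=False)
    evals, evecs = np.linalg.eigh(sigma_n)
    W = evecs / np.sqrt(evals)          # columns scaled so noise cov -> I
    Xw = X @ W                          # noise is now decorrelated, unit-variance
    # Step 2: standard PCA on the whitened data
    evals2, evecs2 = np.linalg.eigh(np.cov(Xw, rowvar=False))
    order = np.argsort(evals2)[::-1]    # descending variance == descending SNR
    return Xw @ evecs2[:, order]
```

Because the noise is whitened first, ranking components by variance in step 2 is equivalent to ranking them by SNR, which is the property the excerpt highlights.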
“…It is an orthogonal mathematical transformation that converts a set of observations of possibly correlated variables into a set of uncorrelated variables called principal components [31]. PCA retains most of the information of the original data in a low-dimensional space [13]. Conventional PCA faces three main challenges: (1) obtaining a covariance matrix in an extremely large spatial dimension; (2) the high computational cost of analyzing a large dataset; and (3) retaining locally structured elements that appear in a small number of bands, for improved discriminant ability, when feature bands are extracted globally as principal components [1].…”
Section: Dimensionality Reduction Techniques
confidence: 99%
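The conventional PCA referenced here — projecting centered data onto the eigenvectors of its covariance matrix, ordered by explained variance — can be sketched as:

```python
import numpy as np

def pca(X, n_components):
    """Standard PCA: center the data, eigendecompose its covariance,
    and project onto the top eigenvectors.  The resulting components
    are mutually uncorrelated."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)            # ascending eigenvalues
    order = np.argsort(evals)[::-1][:n_components]
    return Xc @ evecs[:, order]                   # uncorrelated components
```

The cost the excerpt warns about comes from forming and decomposing the covariance matrix, which grows with the number of bands and pixels.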
“…where λ1 is the regularization parameter that weighs the impact of the TV norm in the minimization of the objective function. In this paper, the reconstruction algorithm using (7) as the optimization function is denoted BCS_TV, in which the image is sampled by the BCS method and reconstructed using the TV norm. The algorithm used to solve the optimization problem is TVAL3 [36].…”
Section: Optimization Problem Using TV Norm
confidence: 99%
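As an illustration of the TV-regularized objective the excerpt describes (a sketch of the general form, not the TVAL3 solver itself), the anisotropic TV norm and an objective of the assumed form ||Ax − y||² + λ1·TV(x) can be written as:

```python
import numpy as np

def tv_norm(img):
    """Anisotropic total-variation norm: sum of absolute forward
    differences along both image axes."""
    dx = np.abs(np.diff(img, axis=1)).sum()
    dy = np.abs(np.diff(img, axis=0)).sum()
    return dx + dy

def objective(x, y, A, lam):
    """TV-regularized least-squares objective; lam plays the role of
    the regularization parameter λ1 in the excerpt."""
    residual = A @ x.ravel() - y
    return residual @ residual + lam * tv_norm(x)
```

A larger `lam` pushes the minimizer toward piecewise-constant images (small TV), at the cost of data fidelity — exactly the trade-off the regularization parameter controls.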
“…Just like PCA, MNF transforms the original data into a new dimensional space whose eigenvectors are mutually orthogonal. MNF has been used in many hyperspectral and multispectral studies to reduce data dimensionality [17,43,[45][46][47]. ENVI 5.2 was used to perform the PCA and MNF transformations.…”
Section: Methods
confidence: 99%