2007
DOI: 10.1109/acssc.2007.4487406

Hyperspectral Image Unmixing via Alternating Projected Subgradients

Abstract: We formulate the problem of hyperspectral image unmixing as a nonconvex optimization problem, similar to nonnegative matrix factorization. We present a heuristic for approximately solving this problem using an alternating projected subgradient approach. Finally, we present the results of applying this method to the 1990 AVIRIS image of Cuprite, Nevada, and show that our results are in agreement with similar studies on the same data.
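Below is a minimal illustrative sketch (not the authors' code) of an alternating projected gradient scheme of the kind the abstract describes, assuming the linear mixing model Y ≈ A S, a squared Frobenius loss, a fixed step size, and nonnegativity enforced by projection after each block update; the paper's actual objective, subgradient oracle, step-size rule, and constraint set may differ.

```python
import numpy as np

def alternating_projected_descent(Y, p, iters=500, step=1e-3, seed=0):
    """Alternate gradient steps on the endmember matrix A and the
    abundance matrix S for Y ~ A @ S, projecting each factor back onto
    the nonnegative orthant after every step (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape                # m spectral bands, n pixels
    A = rng.random((m, p))        # p candidate endmember spectra
    S = rng.random((p, n))        # abundances of the p endmembers per pixel
    for _ in range(iters):
        R = A @ S - Y                               # residual of the mixing model
        A = np.maximum(A - step * (R @ S.T), 0.0)   # gradient step in A, then project
        R = A @ S - Y
        S = np.maximum(S - step * (A.T @ R), 0.0)   # gradient step in S, then project
    return A, S
```

With a subgradient in place of the gradient of the smooth loss, the same loop covers nonsmooth objectives; a sum-to-one constraint on each pixel's abundances would be handled by swapping the orthant projection for a simplex projection (a sketch of which appears further below).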

Cited by 49 publications (53 citation statements) | References 15 publications

Citation statements:
“…Moreover, every limit point of $\{(\mathbf{A}^{(k)}, \mathbf{S}^{(k)})\}$ is a stationary point of (42) under some fairly mild assumptions [75], [76]. For practical reasons, most algorithms use cheap but inexact updates for (46a) and (46b), e.g., multiplicative update [71], one-step projected gradient or subgradient update [60], [69], [72], and one-step majorization minimization [74]. Convergence to a stationary point of these inexact AO methods has still to be thoroughly analyzed.…”
Section: Nonnegative Matrix Factorization
mentioning
confidence: 99%
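As a concrete illustration of the "cheap but inexact" block updates this statement refers to, here is a sketch of the classical Lee and Seung multiplicative update for the Frobenius-norm NMF objective; the variable names and the small epsilon safeguard are illustrative choices, not taken from the cited works.

```python
import numpy as np

def nmf_multiplicative_step(Y, A, S, eps=1e-12):
    """One round of multiplicative updates for
    min_{A>=0, S>=0} ||Y - A S||_F^2: each factor is rescaled
    elementwise, which preserves nonnegativity without a projection."""
    A = A * (Y @ S.T) / (A @ S @ S.T + eps)   # update A with S held fixed
    S = S * (A.T @ Y) / (A.T @ A @ S + eps)   # update S with A held fixed
    return A, S
```

A one-step projected gradient or subgradient update, as in the loop sketched after the abstract above, is the other common inexact alternative mentioned in the statement.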
“…Following the same spirit, $L_{1/2}$-NMF [71] uses a nonconvex, but stronger sparsity-promoting regularizer based on the $\ell_{1/2}$ quasinorm. Apart from sparsity, exploitation of spatial contextual information via TV regularization may also be used [72].…”
Section: Nonnegative Matrix Factorization
mentioning
confidence: 99%
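For concreteness, an $L_{1/2}$-regularized unmixing objective of the kind referred to here is usually written as follows (the notation Y, A, S and the weight $\lambda$ are generic NMF symbols; the exact formulations in the works cited as [71] and [72] may differ in details):

$$
\min_{\mathbf{A}\ge 0,\; \mathbf{S}\ge 0}\;
\tfrac{1}{2}\,\lVert \mathbf{Y}-\mathbf{A}\mathbf{S}\rVert_F^2
\;+\;\lambda \sum_{i,j} S_{ij}^{1/2},
$$

where the second term is the nonconvex $\ell_{1/2}$ quasinorm penalty on the abundances, which promotes sparser solutions than the convex $\ell_1$ norm; a TV regularizer on S would instead penalize differences between the abundances of neighboring pixels.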
“…2, this now works well because the user-defined step is fixed and small. A simple and fast algorithm for projecting onto the probability simplex [52] was utilized. It is also necessary to project x(r) onto its own constraint (x(r) ∈ [0, 1]) after each step.…”
mentioning
confidence: 99%
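The statement above refers to a sort-based projection onto the probability simplex; a generic sketch of that projection and of the clipping of x(r) onto [0, 1] is given below (this is an assumed, standard implementation, not the specific algorithm of the citing paper's reference [52]).

```python
import numpy as np

def project_to_simplex(v):
    """Project a vector v onto the probability simplex
    {w : w >= 0, sum(w) = 1} via the classic sort-and-threshold rule."""
    u = np.sort(v)[::-1]                        # sort in decreasing order
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]  # largest feasible support size
    theta = css[rho] / (rho + 1.0)              # shift that enforces sum-to-one
    return np.maximum(v - theta, 0.0)

def project_to_box(x, lo=0.0, hi=1.0):
    """Project x(r) onto the box constraint [0, 1] by clipping."""
    return np.clip(x, lo, hi)
```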
“…At the level of underlying mathematics, a similar problem appears in material classification and recognition from hyperspectral images [5]. Some remarkable results on hyperspectral unmixing are the Alternating Projected Subgradients [6], the Non-negative Least-correlated Component Analysis (nLCA) [7], the Non-negative Matrix Factorization Minimum Volume Transform (NMF-MVT) [8], the Minimal Volume Simplex Analysis (MVSA) [9] and the Minimal Volume Enclosed Simplex (MVES) [10]. The MVSA algorithm is outstanding in terms of computational time and efficiency [10], [11].…”
Section: Introduction
mentioning
confidence: 99%