2016
DOI: 10.1109/tsp.2015.2483480
Analysis SimCO Algorithms for Sparse Analysis Model Based Dictionary Learning

Abstract: In this paper, we consider the dictionary learning problem for the sparse analysis model. A novel algorithm is proposed by adapting the simultaneous codeword optimization (SimCO) algorithm, originally developed for the sparse synthesis model, to the sparse analysis model. The algorithm assumes that the analysis dictionary contains unit ℓ2-norm atoms and learns the dictionary by optimization on manifolds. This framework allows multiple dictionary atoms to be updated simultaneously in each iteration. However, similar …

Cited by 39 publications (35 citation statements). References 29 publications.
“…It shows that the sequence of values of the target function v k = f (Γ (k) ) converges. This, however, does not imply convergence of the algorithm as suggested in [3], at least not in the sense that the sequence Γ (k) converges. Indeed the sequence Γ (k) could orbit around the set L = {Γ ∈ A : f (Γ) = v}, where v = lim k→∞ v k .…”
Section: Table 1, The FAOL Algorithm
confidence: 94%
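The distinction this statement draws, convergence of the objective values v_k = f(Γ^(k)) without convergence of the iterates Γ^(k) themselves, can be illustrated with a toy sequence (a hypothetical example, not taken from the cited papers): iterates that rotate around a level set of f.

```python
import math

# Toy illustration (hypothetical, not from the cited papers): iterates that
# orbit the level set L = {x : f(x) = 1}, so f(x_k) converges but x_k does not.

def f(x):
    # Objective: squared Euclidean norm.
    return x[0] ** 2 + x[1] ** 2

theta = 1.0  # fixed rotation angle applied at every iteration
iterates = [(math.cos(k * theta), math.sin(k * theta)) for k in range(100)]
values = [f(x) for x in iterates]

# The objective values are constant, hence trivially convergent ...
assert max(abs(v - 1.0) for v in values) < 1e-12

# ... but consecutive iterates stay a fixed chord length 2*sin(theta/2)
# apart, so the sequence of iterates itself never settles.
gap = math.dist(iterates[-1], iterates[-2])
```

Since theta/pi is irrational here, the iterates are dense on the circle and never converge, even though every v_k equals 1: exactly the orbiting behaviour around L described in the statement.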
“…Apart from additional side constraints on Γ, such as incoherence, the optimisation program above has already been used successfully as a starting point for the development of two analysis operator learning algorithms, Analysis K-SVD [15] and Analysis SimCO [2], [3]. AKSVD is an alternating minimisation algorithm, which alternates between finding the best X ∈ X for the current Γ and updating Γ based on the current X.…”
Section: Two Explicit Analysis Operator Learning Algorithms – FAOL and …
confidence: 99%
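The alternating scheme described in the statement (best X for the current Γ, then an update of Γ for the current X) can be sketched as follows. This is a schematic illustration, not the published Analysis K-SVD algorithm: the names `best_X`, the hard-thresholding rule, the gradient step, and the step size are all illustrative assumptions.

```python
import numpy as np

# Schematic alternating minimisation for analysis operator learning
# (illustrative sketch, not the published AKSVD): alternate between the
# best cosparse representation X for the current operator Gamma, and a
# simple gradient update of Gamma for the current X.

rng = np.random.default_rng(0)
d, K, N, cosparsity = 8, 12, 50, 4

Y = rng.standard_normal((d, N))                          # training signals
Gamma = rng.standard_normal((K, d))
Gamma /= np.linalg.norm(Gamma, axis=1, keepdims=True)    # unit-norm rows

def best_X(Gamma, Y, s):
    """Hard-threshold Gamma @ Y: zero the s smallest-magnitude entries
    per column, enforcing cosparsity (an assumed, simplified rule)."""
    X = Gamma @ Y
    idx = np.argsort(np.abs(X), axis=0)[:s, :]           # s smallest per column
    np.put_along_axis(X, idx, 0.0, axis=0)
    return X

step = 0.01
for _ in range(20):
    X = best_X(Gamma, Y, cosparsity)                     # X-step: fix Gamma
    # Gamma-step: gradient of ||Gamma @ Y - X||_F^2 with respect to Gamma.
    Gamma -= step * (Gamma @ Y - X) @ Y.T
    Gamma /= np.linalg.norm(Gamma, axis=1, keepdims=True)  # re-project rows

X_final = best_X(Gamma, Y, cosparsity)
```

The re-projection onto unit-norm rows after each gradient step mirrors the unit ℓ2-norm atom constraint used by Analysis SimCO's manifold formulation.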
“…Zhang et al. [21] use a Recursive Least Squares method to accelerate dictionary learning by updating the dictionary based on the dictionaries from previous iterations. Li et al. [22] and Dong et al. [23] proposed Simultaneous Codeword Optimization (SimCO)-related algorithms that update multiple atoms in each iteration and add an extra incoherence regularisation term to avoid linear dependency among dictionary atoms. On the other hand, Li et al. [24], [25] used the ℓ1/2 norm instead of the ℓ1 norm to achieve stronger sparsity and mathematically guaranteed strong convergence.…”
Section: Introduction
confidence: 99%
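The incoherence regularisation idea mentioned in the statement, an extra term that discourages linear dependency among atoms, is commonly written as a penalty on the deviation of the atoms' Gram matrix from the identity. The sketch below assumes that form; the exact regulariser used in [22], [23] may differ, and the name `incoherence_penalty` is hypothetical.

```python
import numpy as np

# Assumed form of an incoherence regulariser: penalise the deviation of the
# Gram matrix of the (unit-norm) atoms from the identity. Minimising this
# term pushes atoms away from mutual linear dependency.

def incoherence_penalty(Gamma):
    """||Gamma @ Gamma.T - I||_F^2 for an operator with unit-norm rows."""
    K = Gamma.shape[0]
    G = Gamma @ Gamma.T          # Gram matrix: pairwise inner products of atoms
    return float(np.linalg.norm(G - np.eye(K)) ** 2)

rng = np.random.default_rng(1)
Gamma = rng.standard_normal((6, 4))
Gamma /= np.linalg.norm(Gamma, axis=1, keepdims=True)    # unit-norm atoms

# With more atoms than dimensions (K = 6 > d = 4) the rows cannot all be
# mutually orthogonal, so the penalty is strictly positive.
penalty = incoherence_penalty(Gamma)
```

An operator with orthonormal rows attains a penalty of exactly zero, which is only possible when K ≤ d; for the overcomplete case K > d the term can only be driven down, not to zero.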