2012
DOI: 10.1016/j.sigpro.2011.12.015

Multiplicative noise removal via sparse and redundant representations over learned dictionaries and total variation

Cited by 35 publications (14 citation statements)
References 30 publications
“…For each image, a noisy observation is generated by multiplying the original image by a realization of noise according to Eqs. (1) and (2) with the choice L ∈ {3, 5, 7, 9}. Peak Signal to Noise Ratio (PSNR) is used to measure the qualities of the restored images, which is defined as follows:…”
Section: Results
confidence: 99%
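The experimental setup quoted above (multiply the clean image by a noise realization with L looks, then score the restoration with PSNR) can be sketched as follows. The Gamma(L, 1/L) distribution is the standard unit-mean model for L-look speckle, and the `peak=255.0` default is an assumption for 8-bit images; neither detail is taken from this page.

```python
import numpy as np

def add_speckle_noise(image, L, seed=None):
    """Multiply a clean image by unit-mean multiplicative noise.

    eta ~ Gamma(L, 1/L) is the usual L-look speckle model, so the
    noisy observation is f = u * eta with E[eta] = 1.
    """
    rng = np.random.default_rng(seed)
    eta = rng.gamma(shape=L, scale=1.0 / L, size=image.shape)
    return image * eta

def psnr(clean, restored, peak=255.0):
    """Peak Signal to Noise Ratio in dB (higher is better)."""
    mse = np.mean((clean.astype(float) - restored.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

Because the noise has unit mean, the noisy image keeps the brightness of the original on average; only its local fluctuations grow as L decreases.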
“…Popular methods include the Lee method [1], the multiscale shrinkage methods [2], various anisotropic diffusion based methods [3,4], and variational methods [5][6][7][8][9][10][11][12][13][14][15]. The first TV based multiplicative noise removal model was presented by Rudin et al [7], which used a constrained optimization approach with two Lagrange multipliers.…”
Section: Introduction
confidence: 99%
“…The reasons why we choose these methods as comparisons are that (a) these methods not only report good denoising results but also have some theoretical results about their solutions; (b) although some other state-of-the-art methods, such as the curvelet-based methods [13] and the dictionary learning-based method [18], can generate better denoising results, whether the minimization problems in these methods have solutions is still an open question, because some hybrid criteria are used in these methods. We implement all related algorithms in the Matlab 7.0 environment on a computer equipped with a dual-core 1.90 GHz AMD processor.…”
Section: Implementation Details and Experimental Results
confidence: 99%
“…The parameter ε reflects the influence of different regularizers in model (10). Through a lot of experiments, the best parameter μ can be chosen from the fixed range [1,18], and the parameter ε can be chosen from the fixed range [0, 0.6]. The parameter δ in Equation (35) is the time-step of the fixed point iteration method.…”
Section: Criterions of Choosing Parameters
confidence: 99%
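The parameter-selection procedure quoted above (pick the best μ from [1, 18] and ε from [0, 0.6] by running many experiments) amounts to a grid search over the two regularization parameters. The sketch below assumes a generic `denoise(noisy, mu, eps)` solver and a quality `score` such as PSNR; both are placeholders, not the cited model (10).

```python
import itertools
import numpy as np

def grid_search(denoise, clean, noisy, mus, epsilons, score):
    """Sweep (mu, epsilon) pairs and keep the best-scoring pair.

    `denoise` stands in for any solver of the regularized model;
    `score(clean, restored)` is a quality measure where higher is
    better (e.g. PSNR against a known ground truth).
    """
    best = (-np.inf, None, None)
    for mu, eps in itertools.product(mus, epsilons):
        s = score(clean, denoise(noisy, mu, eps))
        if s > best[0]:
            best = (s, mu, eps)
    return best
```

In practice the clean reference is only available on benchmark images, which is why such ranges are tuned offline and then fixed, as the quoted passage describes.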
“…This is due to the fact that natural images contain repeated patterns, such as flat regions and texture regions. Therefore, they can be well approximated as linear combinations of only a few atoms from a dictionary [18][19][20][21]. Dictionary learning in image denoising was proposed by Elad and Aharon.…”
Section: Introduction
confidence: 99%
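The claim quoted above — that image patches are well approximated by linear combinations of only a few dictionary atoms — can be made concrete with a small sparse-coding routine. The orthogonal matching pursuit below is a generic textbook illustration (assuming unit-norm atoms), not the algorithm of the cited paper.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: approximate x with at most k
    atoms (columns) of the dictionary D, assumed unit-norm."""
    residual = x.astype(float).copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit coefficients on the selected support by least squares
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coef[:] = 0.0
        coef[support] = sol
        residual = x - D @ coef
    return coef
```

A patch built from two atoms is recovered exactly with k = 2, which is the sense in which "only a few atoms" suffice for repeated image structures.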