2011
DOI: 10.1109/tip.2011.2121083
Nonlocal Means With Dimensionality Reduction and SURE-Based Parameter Selection

Cited by 97 publications (52 citation statements)
References 31 publications (31 reference statements)
“…(2010) improved this approach by grouping similar patches before PCA decomposition and iterated the process to obtain higher noise reduction. PCA has also been used to robustly compute patch similarities within a non-local means framework (Van de Ville and Kocher, 2010; Zhang et al., 2013; Zhang et al., 2014).…”
Section: Introduction
confidence: 99%
“…Alternatively, we may use Stein's unbiased risk estimator, which is widely used as an efficient parameter-selection method [17][18][19], to derive the adaptive parameter. In addition, we could further reduce the computational burden by optimizing the whole process, particularly via the available speed-up methods for computing nonlocal weights.…”
Section: Conclusion and Discussion
confidence: 99%
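To make the SURE idea mentioned above concrete, here is a sketch of SURE-based parameter selection in its simplest classical setting, soft-thresholding of Gaussian-noise data; this is a generic illustration of the principle, not the NLM-specific SURE derivation of the paper under discussion, and the signal and threshold grid are invented for the example.

```python
import numpy as np

def sure_soft_threshold(y, t, sigma=1.0):
    """Stein's unbiased risk estimate of the MSE incurred by
    soft-thresholding y at level t, under i.i.d. N(0, sigma^2) noise:
    SURE(t) = n*sigma^2 - 2*sigma^2*#{|y_i| <= t} + sum(min(|y_i|, t)^2)."""
    y = np.asarray(y, dtype=float)
    n = y.size
    return (sigma ** 2 * n
            - 2 * sigma ** 2 * np.count_nonzero(np.abs(y) <= t)
            + np.sum(np.minimum(np.abs(y), t) ** 2))

# Select the threshold by minimizing SURE on a sparse synthetic signal.
rng = np.random.default_rng(1)
x = np.zeros(2000); x[:100] = 5.0        # sparse clean signal
y = x + rng.normal(size=x.size)          # noisy observation, sigma = 1
ts = np.linspace(0.0, 4.0, 81)
risks = [sure_soft_threshold(y, t) for t in ts]
t_best = ts[int(np.argmin(risks))]
```

The key property, and the reason SURE is attractive for tuning denoising parameters such as the NLM bandwidth, is that the estimate depends only on the noisy data and the noise variance, so the parameter can be selected without access to the clean signal.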
“…The proposed BSS method added a further 0.3 to 1.1 dB to the best PSNR scores and 2% to 8% to the best SSIM scores obtained with the standard NLM CPW. It is worth pointing out that these gains over the best standard NLM scores are not trivial, and to some extent these BSS scores with simple shrinkage estimation are comparable to or better than more complicated NLM variants, for example the linear expansion with six NLMs (see Table II in [12]) and the multi-patch NLMs (see Table 5 in [1]), both of which require multiple rounds of NLM denoising. Fig.…”
Section: Methods
confidence: 98%