2012 IEEE International Workshop on Machine Learning for Signal Processing
DOI: 10.1109/mlsp.2012.6349757
A Kullback-Leibler divergence approach for wavelet-based blind image deconvolution

Cited by: 5 publications (3 citation statements)
References: 19 publications
“…The cost function considered in [27] is the Kullback-Leibler (KL) divergence [77], which is a measure of the difference between two probability distributions. Such a cost function arises in blind deconvolution when a variational approach is used for estimating the PSF and the image [6,92,102,106,142,143,180]. In regularization- or MAP-based joint estimation the cost function is different, as observed in the Chaps.…”
Section: Spatial Domain Convergence Analysis
confidence: 99%
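As a quick illustration of the KL divergence named in the statement above, here is a minimal sketch for discrete distributions; the function name and the eps smoothing are assumptions for the example, not taken from [27] or [77]:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Discrete Kullback-Leibler divergence D(p || q).
    # p, q: nonnegative weight vectors, normalized here so each
    # sums to one; eps guards against log(0) for empty bins.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Example: divergence between two simple histograms.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # > 0; zero iff p == q
```

In the variational blind-deconvolution setting the quote describes, p and q would be distributions tied to the posterior over the PSF and the image rather than plain histograms.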
“…To cope with this problem, some prior constraints on B or x must be employed [2,4,5]. In the literature, BID is mostly addressed using deconvolution-based methods, where B is deconvolved with y to obtain a sharp version x [6]. These methods can be categorized into two groups [1]: i) disjoint blur identification and image restoration; ii) simultaneous estimation of the blur and the sharp image in one procedure.…”
Section: Introduction
confidence: 99%
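For context, a minimal sketch of the degradation model underlying BID, where the observation is the sharp image convolved with an unknown blur plus noise; the image size, box PSF, and noise level are stand-ins for the example, not values from the cited papers:

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
x = rng.random((64, 64))                  # stand-in sharp image
B = np.full((5, 5), 1.0 / 25.0)           # stand-in blur kernel (5x5 box PSF)
n = 0.01 * rng.standard_normal(x.shape)   # additive Gaussian noise
# Observed image y = B * x + n; the BID task is to recover
# both x and B given only y.
y = convolve2d(x, B, mode="same", boundary="symm") + n
```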
“…It is now quite evident that wavelet subband coefficients of natural images have highly non-Gaussian statistics [14] and exhibit heavy-tailed distributions. The usefulness of heavy-tailed priors for natural images has already been demonstrated in image denoising [13,15], motion deblurring [12], blind deconvolution [1,16], image restoration [17,18,19], and video matting [20]. Examples of this prior class include the generalized Gaussian [21], Jeffreys' noninformative prior [15], and the Gaussian mixture (GM) [17].…”
Section: Introduction
confidence: 99%
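A minimal sketch of the generalized Gaussian density mentioned in the statement above; the shape parameter beta controls tail weight (beta = 2 recovers the Gaussian, beta = 1 the Laplacian, and beta < 2 gives the heavy tails typically fit to wavelet subbands). The function name is illustrative:

```python
import numpy as np
from scipy.special import gamma

def generalized_gaussian_pdf(x, alpha, beta):
    # Generalized Gaussian density with scale alpha and shape beta:
    #   p(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x|/alpha)**beta)
    # beta < 2 yields tails heavier than a Gaussian, matching the
    # observed statistics of wavelet subband coefficients.
    norm = beta / (2.0 * alpha * gamma(1.0 / beta))
    return norm * np.exp(-(np.abs(x) / alpha) ** beta)

# Example: heavy-tailed (beta = 0.8) vs. Gaussian (beta = 2) at x = 3.
print(generalized_gaussian_pdf(3.0, 1.0, 0.8))
print(generalized_gaussian_pdf(3.0, 1.0, 2.0))
```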