Abstract: For blind deconvolution of an unknown sparse sequence convolved with an unknown pulse, a powerful Bayesian method employs the Gibbs sampler in combination with a Bernoulli-Gaussian prior modeling sparsity. In this paper, we extend this method by introducing a minimum distance constraint for the pulses in the sequence. This is physically relevant in applications including layer detection, medical imaging, seismology, and multipath parameter estimation. We propose a Bayesian method for blind deconvoluti…
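As a minimal sketch of the signal model described in this abstract, the following generates a Bernoulli-Gaussian sparse sequence whose nonzero positions respect a minimum distance constraint, then convolves it with a pulse. All parameter values (`N`, `p`, `sigma_a`, `d_min`) and the Gaussian pulse shape are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): sequence length, activation
# probability, amplitude std, and minimum distance between active samples.
N, p, sigma_a, d_min = 200, 0.05, 1.0, 10

x = np.zeros(N)
last = -d_min  # index of the most recently activated sample
for n in range(N):
    # Activate position n only if the minimum distance constraint allows it.
    if n - last >= d_min and rng.random() < p:
        x[n] = sigma_a * rng.standard_normal()
        last = n

# Convolve with a toy Gaussian pulse and add a little noise.
pulse = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
y = np.convolve(x, pulse, mode="same") + 0.01 * rng.standard_normal(N)

active = np.flatnonzero(x)
# By construction, consecutive active positions are at least d_min apart.
assert np.all(np.diff(active) >= d_min)
```

Sequentially enforcing the constraint during generation (rather than rejecting whole sequences) is one simple way to realize such a prior for simulation purposes.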
“…Although this strategy can significantly decrease the complexity of the sampling process, it must be implemented with care to guarantee that the desired stationary distribution is preserved. Applications of PCGS algorithms can be found in [66][67][68].…”
Section: Alternative II: Eliminate the Coupling Induced by H D(σ^(T)) H
Abstract: In this paper, we are interested in Bayesian inverse problems where either the data fidelity term or the prior distribution is Gaussian or derived from a hierarchical Gaussian model. Generally, Markov chain Monte Carlo (MCMC) algorithms allow us to generate sets of samples that are employed to infer some relevant parameters of the underlying distributions. However, when the parameter space is high-dimensional, the performance of stochastic sampling algorithms is very sensitive to existing dependencies between parameters. In particular, this problem arises when one aims to sample from a high-dimensional Gaussian distribution whose covariance matrix does not present a simple structure. Another challenge is the design of Metropolis-Hastings proposals that make use of information about the local geometry of the target density in order to speed up the convergence and improve mixing properties in the parameter space, while not being too computationally expensive. These two contexts are mainly related to the presence of two heterogeneous sources of dependencies, stemming either from the prior or the likelihood, in the sense that the related covariance matrices cannot be diagonalized in the same basis. In this work, we address these two issues. Our contribution consists of adding auxiliary variables to the model in order to dissociate the two sources of dependencies. In the new augmented space, only one source of correlation remains directly related to the target parameters, the other sources of correlations being captured by the auxiliary variables. Experiments are conducted on two practical image restoration problems, namely the recovery of multichannel blurred images embedded in Gaussian noise and the recovery of a signal corrupted by mixed Gaussian noise. Experimental results indicate that adding the proposed auxiliary variables makes the sampling problem simpler since the new conditional distribution no longer contains highly heterogeneous correlations.
Thus, the computational cost of each iteration of the Gibbs sampler is significantly reduced while ensuring good mixing properties.
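The auxiliary-variable idea in this abstract can be illustrated with a minimal numeric sketch. Assume (as a toy stand-in for the paper's models) a Gaussian target with precision H^T Λ H + Ω, where Ω is diagonal but H couples the coordinates. Introducing u | x ~ N(x, R⁻¹) with R = (1/μ)I − H^T Λ H cancels the H-induced quadratic term, so the conditional of x given (u, y) has the diagonal precision Ω + (1/μ)I and is cheap to sample. All matrices and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 30
H = np.eye(n) + 0.3 * np.eye(n, k=1)        # toy coupling operator
Lam = np.eye(n) / 0.5**2                    # noise precision (likelihood)
Omega = np.diag(rng.uniform(0.5, 2.0, n))   # diagonal prior precision
y = rng.standard_normal(n)

HtLH = H.T @ Lam @ H
mu = 0.9 / np.linalg.eigvalsh(HtLH).max()   # keeps R positive definite
R = np.eye(n) / mu - HtLH
Q_x = Omega + np.eye(n) / mu                # diagonal conditional precision

# Two-block Gibbs sampler in the augmented (x, u) space.
x = np.zeros(n)
chol_Rinv = np.linalg.cholesky(np.linalg.inv(R))
for _ in range(500):
    u = x + chol_Rinv @ rng.standard_normal(n)           # sample u | x
    mean = np.linalg.solve(Q_x, H.T @ Lam @ y + R @ u)   # x | u, y is Gaussian
    x = mean + rng.standard_normal(n) / np.sqrt(np.diag(Q_x))
```

Marginalizing u exactly recovers the original target, since the Gaussian integral over u contributes a constant independent of x; the augmentation only reorganizes where the coupling lives.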
“…1). Similar to the blind deconvolution problem in [17,18], the T wave is modeled by the convolution of an unknown binary "indicator sequence" $\mathbf{b}_{T,n} = (b_{T,n,1} \,\dots\, b_{T,n,N_{T,n}})^T$ indicating the wave locations ($b_{T,n,k} = 1$ if there is a wave at the $k$th possible location, $b_{T,n,k} = 0$ otherwise) with an unknown T waveform $\mathbf{h}_{T,n} = (h_{T,n,-L} \,\dots\, h_{T,n,L})^T$. Analogous definitions for the P wave yield $\mathbf{b}_{P,n} = (b_{P,n,1} \,\dots\, b_{P,n,N_{P,n}})^T$ and…”
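The quoted model, convolving a binary indicator sequence with a waveform supported on {−L, …, L}, can be sketched in a few lines. The waveform shape, sequence length, and wave locations below are illustrative, not taken from the paper.

```python
import numpy as np

L = 4
h = np.hanning(2 * L + 1)   # toy waveform h_{-L}, ..., h_{L} (peak 1 at center)
b = np.zeros(40)
b[[10, 28]] = 1.0           # indicator sequence: waves at two locations

# The modeled wave is the convolution of the indicator sequence with the
# waveform; mode="same" keeps the output aligned with the indicator grid.
wave = np.convolve(b, h, mode="same")
```

Because the two locations are farther apart than the waveform support, the output is simply a copy of `h` centered at each active index.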
et al. Sequential beat-to-beat P and T wave delineation and waveform estimation in ECG signals: Block Gibbs sampler and marginalized particle filter. Signal Processing, Elsevier, 2014, vol. 104, pp. 174–187. doi: 10.1016/j.sigpro.2014…
For ECG interpretation, the detection and delineation of P and T waves are challenging tasks. This paper proposes sequential Bayesian methods for simultaneous detection, threshold-free delineation, and waveform estimation of P and T waves on a beat-to-beat basis. By contrast to state-of-the-art methods that process multiple-beat signal blocks, the proposed Bayesian methods account for beat-to-beat waveform variations by sequentially estimating the waveforms for each beat. Our methods are based on Bayesian signal models that take into account previous beats as prior information. To estimate the unknown parameters of these Bayesian models, we first propose a block Gibbs sampler that exhibits fast convergence in spite of the strong local dependencies in the ECG signal. Then, in order to take into account all the information contained in the past rather than considering only one previous beat, a sequential Monte Carlo method is presented, with a marginalized particle filter that efficiently estimates the unknown parameters of the dynamic model. Both methods are evaluated on the annotated QT database and observed to achieve significant improvements in detection rate and delineation accuracy compared to state-of-the-art methods, thus providing promising approaches for sequential P and T wave analysis.
“…The proposed BD method overcomes certain weaknesses of the traditional SMLR-based BD method (Mendel, 1983), which is verified experimentally to result in improved detection/estimation performance and reduced computational complexity. Our simulation results also demonstrate performance and complexity advantages relative to the iterated window maximization (IWM) algorithm (Kaaresen, 1997) and a recently proposed partially collapsed Gibbs sampler method (Kail et al., 2012).…”
“…Because the result of BD is inherently nonunique, additional assumptions or constraints, such as monotonicity [16], positivity [17]–[19], and sparsity [20]–[23], are typically used. In this paper, we study BD under a combined sparsity and minimum distance constraint as introduced recently in [24].…”
We consider Bayesian blind deconvolution (BD) of an unknown sparse sequence convolved with an unknown pulse. Our goal is to detect the positions where the sparse input sequence is nonzero and to estimate the corresponding amplitudes as well as the pulse shape. For this task, we propose a novel evolution of the single most likely replacement (SMLR) algorithm. Our method uses a modified Bernoulli-Gaussian prior that incorporates a minimum temporal distance constraint. This prior simultaneously induces sparsity and enforces a prescribed minimum distance between the pulse centers. The minimum distance constraint provides an effective way to avoid overfitting (i.e., spurious detected pulses) and improve resolution. The proposed BD method overcomes certain weaknesses of the traditional SMLR-based BD method (Mendel, 1983), which is verified experimentally to result in improved detection/estimation performance and reduced computational complexity. Our simulation results also demonstrate performance and complexity advantages relative to the iterated window maximization (IWM) algorithm (Kaaresen, 1997) and a recently proposed partially collapsed Gibbs sampler method (Kail et al., 2012).
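As a rough illustration of the single-replacement search strategy this abstract builds on, here is a hedged, SMLR-flavored greedy sketch (not the authors' exact algorithm): starting from an empty support, it repeatedly applies the single activation or deactivation that most improves a penalized least-squares objective, rejecting activations that would violate a minimum distance `d_min`. The objective, penalty `lam`, and all names are illustrative assumptions.

```python
import numpy as np

def smlr_sketch(y, H, lam=2.0, d_min=5, max_iter=50):
    """Greedy single-replacement search for the support of a sparse input."""
    N = H.shape[1]
    support = set()

    def objective(S):
        # Penalized least-squares fit with amplitudes refit on the support S.
        if not S:
            return -0.5 * y @ y
        Hs = H[:, sorted(S)]
        amp, *_ = np.linalg.lstsq(Hs, y, rcond=None)
        r = y - Hs @ amp
        return -0.5 * r @ r - lam * len(S)

    best = objective(support)
    for _ in range(max_iter):
        cand, cand_val = None, best
        for n in range(N):
            S = set(support)
            if n in S:
                S.remove(n)            # candidate: deactivate position n
            else:
                if any(abs(n - m) < d_min for m in S):
                    continue           # would violate the distance constraint
                S.add(n)               # candidate: activate position n
            v = objective(S)
            if v > cand_val:
                cand, cand_val = S, v
        if cand is None:
            break                      # no single replacement improves things
        support, best = cand, cand_val
    return sorted(support)
```

On noiseless data with well-separated pulses, this kind of greedy search typically recovers the true support; the minimum distance check is what suppresses spurious neighboring detections.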
Index Terms: Bayesian blind deconvolution, sparse deconvolution, single most likely replacement (SMLR) algorithm, Bernoulli-Gaussian prior, iterated window maximization (IWM) algorithm.