Many applications in signal processing require the estimation of parameters of interest from a set of observed data. More specifically, Bayesian inference requires the computation of a-posteriori estimators, which are often expressed as complicated multidimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov Chain Monte Carlo (MCMC) algorithms, which generate a Markov chain whose stationary distribution coincides with the target posterior density. In this work, we perform a thorough review of MCMC methods that use multiple candidates to select the next state of the chain at each iteration. With respect to the classical Metropolis-Hastings method, the use of multiple-try techniques fosters the exploration of the sample space. We present different Multiple Try Metropolis schemes, Ensemble MCMC methods, Particle Metropolis-Hastings algorithms, and the Delayed Rejection Metropolis technique. We highlight limitations, benefits, connections, and differences among the different methods, and compare them by numerical simulations.

Techniques using multiple candidates have become very popular in the signal processing community. For instance, this is the case of the Particle Metropolis-Hastings (PMH) and the Particle Marginal Metropolis-Hastings (PMMH) algorithms, which have been widely used in signal processing for inference and smoothing of dynamical and static parameters in state-space models [26,27]. PMH can be interpreted as an MTM scheme where the different candidates are generated and weighted by the use of a particle filter [28,29]. In this work, we present PMH and PMMH and discuss their connections and differences with the classical MTM approach (note that MTM includes OBMC as a special case; see Section 4.1.1).
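To make the multiple-candidate idea concrete, the following is a minimal sketch of one Multiple Try Metropolis step with an independent proposal and importance weights w(z) = π(z)/q(z). The one-dimensional Gaussian target and Gaussian proposal are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density of the target (standard normal, illustrative).
    return -0.5 * x**2

# Independent Gaussian proposal q(x) = N(0, sigma^2), sigma chosen arbitrarily.
sigma = 2.0
def sample_proposal(n):
    return rng.normal(0.0, sigma, size=n)
def log_proposal(x):
    return -0.5 * (x / sigma) ** 2 - np.log(sigma)

def mtm_step(x, N=5):
    """One MTM step: draw N candidates, pick one proportionally to its
    importance weight, accept with the ratio of summed weights."""
    # 1) Draw N candidates and compute their log-weights.
    y = sample_proposal(N)
    logw_y = log_target(y) - log_proposal(y)
    # 2) Select one candidate with probability proportional to its weight
    #    (shift by the max for numerical stability; probabilities unchanged).
    w = np.exp(logw_y - logw_y.max())
    k = rng.choice(N, p=w / w.sum())
    # 3) Draw N-1 auxiliary points and include the current state x.
    x_ref = np.append(sample_proposal(N - 1), x)
    logw_x = log_target(x_ref) - log_proposal(x_ref)
    # 4) Accept the selected candidate with the MTM acceptance ratio.
    log_alpha = np.logaddexp.reduce(logw_y) - np.logaddexp.reduce(logw_x)
    if np.log(rng.uniform()) < min(0.0, log_alpha):
        return y[k]
    return x

x, chain = 0.0, []
for _ in range(5000):
    x = mtm_step(x)
    chain.append(x)
chain = np.array(chain)
print(chain.mean(), chain.var())  # should be close to 0 and 1
```

Using several weighted candidates per iteration makes an acceptance more likely than in standard MH with the same proposal, at the price of extra target evaluations per step.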
Furthermore, we describe a suitable procedure for recycling some candidates in the final Monte Carlo estimators, called Group Metropolis Sampling (GMS) [29,30]. The GMS scheme can also be seen as a way of generating a chain of sets of weighted samples. Finally, note that other similar and related techniques can be found within the so-called data-augmentation approach [31,32].

The remainder of the paper is organized as follows. Section 2 recalls the problem statement and some background material, also introducing the required notation. The basics of MCMC and the Metropolis-Hastings (MH) algorithm are presented in Section 3. Section 4 is the core of the work, describing the different MCMC schemes that use multiple candidates. Section 6 provides some numerical results, applying the different techniques to a hyperparameter tuning problem for a Gaussian Process regression model and to a localization problem in a wireless sensor network. Some conclusions are given in Section 7.
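The recycling idea behind GMS, generating a chain whose states are whole weighted sets and letting every candidate contribute to the estimator, can be sketched as follows. The Gaussian target, independent Gaussian proposal, and the moment being estimated are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Unnormalized log-density of the target (standard normal, illustrative).
    return -0.5 * x**2

sigma = 2.0  # scale of the independent Gaussian proposal (illustrative)

def weighted_set(N):
    """Draw a set of N candidates with log importance weights log(pi/q)."""
    y = rng.normal(0.0, sigma, size=N)
    logw = log_target(y) - (-0.5 * (y / sigma) ** 2 - np.log(sigma))
    return y, logw

# Chain over weighted sets: a new set replaces the current one with
# probability min(1, sum(w_new)/sum(w_old)); all candidates of the
# retained set are recycled in the running estimator.
N, T = 10, 3000
y, logw = weighted_set(N)
acc = 0.0
for _ in range(T):
    y_new, logw_new = weighted_set(N)
    if np.log(rng.uniform()) < (np.logaddexp.reduce(logw_new)
                                - np.logaddexp.reduce(logw)):
        y, logw = y_new, logw_new
    wbar = np.exp(logw - np.logaddexp.reduce(logw))  # normalized weights
    acc += np.sum(wbar * y**2)  # recycle all candidates for E[x^2]
est = acc / T
print(est)  # estimate of E[x^2] = 1 under the standard normal target
```

Compared with a plain MTM chain that keeps only the selected candidate, recycling the whole weighted set reuses target evaluations that would otherwise be discarded.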
2. Problem statement and preliminaries

In many signal processing applications, the goal consists in inferring a variable of interest,