“…Prior distributions are chosen to pull Markov chain Monte Carlo (MCMC) samples away from inappropriate results that are consistent with the likelihood but would not be consistent with domain knowledge [51]. By using a scaled-sigmoid Gaussian prior bounded by θ_min and θ_max, we search only for solutions constrained to an appropriate interval [52]. By Bayes' theorem, the joint posterior probability for the hyperparameters and f satisfies

P(f, m, σ_ϵ, κ, σ_f, l | y) ∝ P(y | f, σ_ϵ, κ) P(f | m, σ_f, l) P(m) P(σ_ϵ) P(κ) P(σ_f) P(l),

from which we draw random samples by MCMC sampling, more specifically block Gibbs sampling with elliptical slice sampling within each block [52,53] (electronic supplementary material, appendix C).…”
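Since the excerpt names elliptical slice sampling as the within-block update for the latent function f, a minimal sketch may help. This is a generic implementation of the standard elliptical slice sampling update (Murray, Adams & MacKay, 2010) for a latent vector with a zero-mean Gaussian prior, not the authors' code; the names `elliptical_slice` and `log_lik`, and the toy Gaussian likelihood in the usage example, are illustrative assumptions.

```python
import numpy as np

def elliptical_slice(f, chol_sigma, log_lik, rng):
    """One elliptical slice sampling update for f ~ N(0, Sigma) a priori.

    f          : current latent vector
    chol_sigma : Cholesky factor L of the prior covariance (Sigma = L @ L.T)
    log_lik    : function returning the log-likelihood of a latent vector
    rng        : numpy random Generator
    """
    # Draw an auxiliary variate from the prior to define the ellipse.
    nu = chol_sigma @ rng.standard_normal(f.shape)
    # Slice level: log-likelihood threshold below the current state.
    log_y = log_lik(f) + np.log(rng.uniform())
    # Initial proposal angle and shrinking bracket.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        f_prop = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_prop) > log_y:
            return f_prop  # accepted; always terminates with probability 1
        # Shrink the bracket toward theta = 0 (the current state) and retry.
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Illustrative usage: scalar latent with prior N(0, 1) and Gaussian
# likelihood centred at y = 1, so the posterior is N(0.5, 0.5).
rng = np.random.default_rng(0)
L = np.array([[1.0]])
log_lik = lambda f: -0.5 * (f[0] - 1.0) ** 2
f = np.zeros(1)
samples = []
for _ in range(5000):
    f = elliptical_slice(f, L, log_lik, rng)
    samples.append(f[0])
```

In a block Gibbs scheme such as the one the excerpt describes, an update like this for f would alternate with updates of the hyperparameters (m, σ_ϵ, κ, σ_f, l) conditioned on the current f; the attraction of the elliptical slice move is that it has no step-size tuning parameter and always accepts.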