2017
DOI: 10.1088/1367-2630/aa77cf
Structured filtering

Abstract: A major challenge facing existing sequential Monte Carlo methods for parameter estimation in physics stems from the inability of existing approaches to robustly deal with experiments that have different mechanisms that yield the results with equivalent probability. We address this problem here by proposing a form of particle filtering that clusters the particles that comprise the sequential Monte Carlo approximation to the posterior before applying a resampler. Through a new graphical approach to thinking abou…
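The abstract describes clustering the SMC particle cloud before resampling so that distinct but equally likely posterior modes are not collapsed into one. A minimal NumPy sketch of that idea is below; the gap-based 1D clustering and the per-cluster multinomial resampler are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def clustered_resample(particles, weights, rng=None):
    """Sketch of structured resampling: split the particle cloud into
    clusters, then resample within each cluster in proportion to its
    total weight, so separate posterior modes survive the resample."""
    rng = np.random.default_rng() if rng is None else rng
    # Crude 1D clustering: sort particles and split at the largest gap
    # (a stand-in for the more general clustering step in the paper).
    order = np.argsort(particles)
    gaps = np.diff(particles[order])
    split = np.argmax(gaps) + 1
    clusters = [order[:split], order[split:]]
    new_particles = []
    for idx in clusters:
        w = weights[idx]
        mass = w.sum()
        if mass == 0:
            continue
        # Each cluster keeps a particle count proportional to its mass.
        n_draw = max(1, int(round(mass * len(particles))))
        new_particles.append(rng.choice(particles[idx], size=n_draw, p=w / mass))
    new_particles = np.concatenate(new_particles)
    new_weights = np.full(len(new_particles), 1.0 / len(new_particles))
    return new_particles, new_weights
```

With a bimodal cloud (particles near 0 and near 1, equal weights), both modes remain populated after resampling, which is exactly the failure mode of a naive global resampler that this construction avoids.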

Cited by 7 publications (5 citation statements). References: 32 publications.
“…A practical difficulty is that post-processing the Bayesian distribution requires evaluating it on a grid of φ values, which can be problematic for broad distributions and in multiparameter scenarios. Efficient Bayesian inference for the estimation of multiple parameters [51,266,267] thus requires approximate methods, such as particle filtering and sequential Monte Carlo methods [60,265,268], or structured filtering [269] when dealing with multiple, equivalent optima.…”
Section: Box A: Bayesian Inferencementioning
confidence: 99%
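The excerpt above contrasts grid-based posterior evaluation with particle methods. The grid approach it refers to can be sketched in a few lines; the cosine likelihood model P(0 | φ; θ) = cos²(φθ/2) is an illustrative assumption typical of Ramsey-type phase estimation, not something specified by this excerpt:

```python
import numpy as np

def grid_bayes_update(phi_grid, prior, outcome, theta):
    """One Bayesian update on a fixed grid of phi values.
    Assumed likelihood: P(0 | phi; theta) = cos^2(phi * theta / 2).
    Cost scales with grid size per parameter, which is why the excerpt
    recommends particle filters in multiparameter scenarios."""
    like0 = np.cos(phi_grid * theta / 2.0) ** 2
    like = like0 if outcome == 0 else 1.0 - like0
    post = prior * like
    return post / post.sum()  # renormalise on the grid
```

For d parameters a grid of n points per axis costs O(n^d) likelihood evaluations per update, whereas an SMC approximation keeps the particle count fixed, which is the scaling argument behind the excerpt's recommendation.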
“…Inheritance rules can be seen as an extreme pruning rule for all the other nodes in µ (Fig. 1d), similar to what is adopted for the graphical model exploration in [52]. Therefore, another way to mitigate the greediness of the approach is to discard only those particularly unsuccessful models Ĥr that have B_rj < b, ∀j in the same layer, with b a user-defined threshold.…”
Section: Scalability and Generalisationmentioning
confidence: 99%
“…The reason to adopt a logarithmic likelihood here is that likelihoods are prone to numerical instabilities as the number of sampled measurements grows [50], so adopting a log-difference instead of a ratio can help reduce artefacts. BFs are known to be a statistically robust measure for comparing the predictive power of different models, whilst favourably weighting less structure to limit overfitting [34], and have been successfully used to resolve multi-modal distributions in [52]. The resulting B_ij are stored as a comparative Directed Acyclic Graph (cDAG) representation across the same nodes, where the edges' directionality maps the sign of B_ij − 1, pointing towards the model favoured by statistical evidence (Fig.…
Section: Hyperparameterised Modelling Of Hahn Echo Experimentsmentioning
confidence: 99%
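The log-difference trick described in this excerpt amounts to accumulating per-measurement log-likelihoods and subtracting, rather than multiplying raw likelihoods and dividing. A minimal sketch (function name is my own, not from the cited work):

```python
import numpy as np

def log_bayes_factor(loglik_i, loglik_j):
    """Log Bayes factor ln(B_ij) from per-measurement log-likelihoods
    of two candidate models. Summing logs avoids the underflow that
    multiplying many probabilities < 1 would cause; a positive result
    favours model i."""
    return float(np.sum(loglik_i) - np.sum(loglik_j))
```

Multiplying, say, a thousand likelihoods of order 0.5 each underflows double precision (0.5^1000 ≈ 10^-301), while the corresponding sum of logs stays a perfectly representable number near −693, which is the numerical-stability point the excerpt makes.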
“…In practice, we use a method proposed by Liu and West [29] to move the particles but other methods exist and we recommend reviewing [28,30,31] for more details. Here, we will use the implementation of particle filtering and Liu-West resampling provided by the QInfer package [32].…”
Section: Approximate Bayesian Inferencementioning
confidence: 99%
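The Liu-West move referenced in this excerpt shrinks resampled particles toward the posterior mean and adds matching Gaussian jitter, approximately preserving the first two moments. The excerpt uses QInfer's implementation; the standalone NumPy sketch below is illustrative, with the shrinkage parameter `a` set to the conventional 0.98:

```python
import numpy as np

def liu_west_resample(particles, weights, a=0.98, rng=None):
    """Liu-West resampling sketch for particles of shape (n, d):
    draw ancestors by weight, shrink toward the weighted mean by a,
    then jitter with covariance (1 - a^2) * Cov so that the posterior
    mean and covariance are approximately preserved."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = particles.shape
    mu = np.average(particles, weights=weights, axis=0)
    cov = np.atleast_2d(np.cov(particles.T, aweights=weights))
    h2 = 1.0 - a ** 2
    idx = rng.choice(n, size=n, p=weights)          # multinomial ancestors
    centers = a * particles[idx] + (1.0 - a) * mu   # shrink toward the mean
    jitter = rng.multivariate_normal(np.zeros(d), h2 * cov, size=n)
    new_particles = centers + jitter
    new_weights = np.full(n, 1.0 / n)
    return new_particles, new_weights
```

The shrinkage compensates for the variance the jitter adds: without it, repeated resampling would artificially inflate the posterior width.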

Bayesian ACRONYM Tuning. Gamble, Granade, Wiebe (2019). Preprint; self-citation.