2023
DOI: 10.1093/jrsssb/qkac007
Bayesian fusion: scalable unification of distributed statistical analyses

Hongsheng Dai, Murray Pollock, Gareth O. Roberts

Abstract: There has been considerable interest in addressing the problem of unifying distributed analyses into a single coherent inference, which arises in big-data settings, when working under privacy constraints, and in Bayesian model choice. Most existing approaches have relied upon approximations of the distributed analyses, which have significant shortcomings: the quality of the inference can degrade rapidly with the number of analyses being unified, and can be substantially biased when unifying analyses that do not concur.
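As a point of reference for the abstract, the unification ("fusion") problem can be stated concisely: given C distributed analyses producing sub-posteriors f_1, ..., f_C for a common parameter x, the goal is to recover their product. The display below is a standard statement of this target as it appears in the fusion literature; the notation is a sketch of ours, not copied from the paper:

$$
f(x) \;\propto\; \prod_{c=1}^{C} f_c(x).
$$

Approximate approaches reconstruct this product from summaries of each f_c, which is where the degradation and bias described in the abstract can arise.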

Cited by 2 publications (1 citation statement) · References 50 publications
“…Alternatively, Bai and Chandra (2023) described a Bayesian ensemble learning framework that uses gradient boosting by combining multiple neural networks trained by Markov chain Monte Carlo (MCMC) sampling. Finally, Dai, Pollock, and Roberts (2023) demonstrate the robustness of Bayesian fusion by embedding the Monte Carlo fusion framework within a sequential Monte Carlo algorithm.…”
Section: Introduction
Confidence: 99%
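To make the fusion target concrete, here is a minimal, self-contained sketch of a naive Monte Carlo unification for Gaussian sub-posteriors, a setting where the exact fused distribution is available in closed form for comparison. This is illustrative only: it is not the Monte Carlo fusion or sequential Monte Carlo algorithm of Dai, Pollock, and Roberts (2023), and all names and values are hypothetical.

```python
import numpy as np

# Illustration: each of C "analyses" is summarised by a Gaussian
# sub-posterior N(mu_c, sigma_c^2) over a shared scalar parameter x.
# The fusion target is f(x) ∝ prod_c f_c(x). For Gaussians the product
# is itself Gaussian, so a simple importance-sampling unification can
# be checked against the exact answer.

rng = np.random.default_rng(0)

mus = np.array([-0.5, 0.2, 0.4])    # sub-posterior means (hypothetical)
sigmas = np.array([1.0, 0.8, 1.2])  # sub-posterior std devs (hypothetical)

# Exact fused Gaussian: precision-weighted combination of the factors.
prec = 1.0 / sigmas**2
fused_var = 1.0 / prec.sum()
fused_mean = fused_var * (prec * mus).sum()

# Naive Monte Carlo unification: propose from the first sub-posterior,
# weight each draw by the densities of the remaining sub-posteriors
# (normalising constants cancel when the weights are renormalised).
n = 100_000
x = rng.normal(mus[0], sigmas[0], size=n)
log_w = np.zeros(n)
for mu, sig in zip(mus[1:], sigmas[1:]):
    log_w += -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)
w = np.exp(log_w - log_w.max())
w /= w.sum()

is_mean = np.sum(w * x)
is_var = np.sum(w * (x - is_mean) ** 2)
print(f"exact fused mean {fused_mean:.3f}, IS estimate {is_mean:.3f}")
print(f"exact fused var  {fused_var:.3f}, IS estimate {is_var:.3f}")
```

Importance weighting of this naive kind is known to degenerate as the number of analyses C grows or as the sub-posteriors disagree, which is broadly the scalability and robustness issue that embedding fusion within a sequential Monte Carlo algorithm is intended to address.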