2016
DOI: 10.1109/tsp.2015.2493979
Distributed Variational Bayesian Algorithms Over Sensor Networks

Abstract: Distributed inference/estimation in a Bayesian framework in the context of sensor networks has recently received much attention due to its broad applicability. The variational Bayesian (VB) algorithm is a technique for approximating intractable integrals arising in Bayesian inference. In this paper, we propose two novel distributed VB algorithms for the general Bayesian inference problem, which can be applied to a very general class of conjugate-exponential models. In the first approach, the global natural parameter…
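The VB technique described in the abstract can be illustrated with a minimal sketch. Assuming a simple conjugate-exponential model not taken from the paper (Gaussian data with unknown mean and precision under a normal-gamma prior; all variable names are illustrative), coordinate-ascent VB alternates closed-form updates of the two approximate factors:

```python
import random

# Sketch of coordinate-ascent variational Bayes (CAVI) for a
# conjugate-exponential model: x_i ~ N(mu, 1/tau), with prior
# mu ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).
# Illustrative only; not the paper's distributed algorithm.
def cavi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    n = len(x)
    sx = sum(x)
    sx2 = sum(v * v for v in x)
    xbar = sx / n
    e_tau = a0 / b0                              # initial E[tau]
    for _ in range(iters):
        # Update q(mu) = N(mu_n, 1/lam_n), which depends on E[tau]
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        # Update q(tau) = Gamma(a_n, b_n), which depends on E[mu], E[mu^2]
        e_mu = mu_n
        e_mu2 = mu_n ** 2 + 1.0 / lam_n
        a_n = a0 + (n + 1) / 2.0
        b_n = b0 + 0.5 * (sx2 - 2 * e_mu * sx + n * e_mu2
                          + lam0 * (e_mu2 - 2 * mu0 * e_mu + mu0 ** 2))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

random.seed(0)
data = [random.gauss(2.0, 0.5) for _ in range(500)]
mu_n, lam_n, a_n, b_n = cavi_gaussian(data)
```

With enough data, the variational posterior mean `mu_n` tracks the sample mean and `a_n / b_n` estimates the data precision; a distributed version would replace the global sufficient statistics (`sx`, `sx2`) with network-aggregated ones.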

Cited by 41 publications (34 citation statements)
References 53 publications
“…For the Bayesian learning problem in wireless sensor networks, references [10,13,19] systematically studied how to solve Bayesian learning using the variational Bayes method in a distributed environment. For the problem of Bayesian inference and estimation over a network, reference [10] proposed a general framework of distributed variational Bayesian algorithms for conjugate-exponential family models. For the joint sparse signal recovery problem in sensor networks, a distributed variational Bayesian algorithm based on quantized communication and inexact ADMM was proposed in [19].…”
Section: Distributed Bayesian Learning
confidence: 99%
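The "global natural parameter" framework attributed to [10] suggests a consensus-style exchange of local statistics. A minimal sketch, assuming a Metropolis-weighted averaging step (this specific scheme is an assumption for illustration, not taken from [10] or [19]):

```python
# Hypothetical sketch of the consensus step in a distributed VB scheme:
# each node holds a local (scalar) natural parameter, and repeated
# neighbor averaging with Metropolis weights drives every node to the
# network-wide average, which each node's VB update could then use.

def metropolis_weights(adj):
    n = len(adj)
    deg = [sum(row) for row in adj]
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if adj[i][j]:
                w[i][j] = 1.0 / (1 + max(deg[i], deg[j]))
        w[i][i] = 1.0 - sum(w[i])          # rows sum to one
    return w

def consensus(theta, adj, iters=200):
    w = metropolis_weights(adj)
    n = len(theta)
    for _ in range(iters):
        theta = [sum(w[i][j] * theta[j] for j in range(n)) for i in range(n)]
    return theta

# 4-node ring topology; local "natural parameters" at each node
ring = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
avg = consensus([1.0, 2.0, 3.0, 4.0], ring)
```

Because the Metropolis weight matrix is doubly stochastic, every node converges to the average 2.5 without any fusion center, using only neighbor-to-neighbor communication.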
“…Therefore, real-time online information processing, data analysis, and dynamic optimization of agents are accomplished [8,9]. An intelligent network is a highly intelligent distributed network system in which each agent is capable of self-recognition, self-judgment, and self-adjustment through the network topology [10]. At the individual level, an intelligent network requires that each agent not only perceive its environmental information but also communicate over the network with the agents it is connected to, enabling ongoing reinforcement learning [11].…”
Section: Introduction
confidence: 99%
“…An application to the distributed inference/estimation of a Bayesian Gaussian mixture model is then presented to assess the effectiveness of the proposed algorithms. Simulations on both real and synthetic datasets show that the proposed algorithms perform excellently, nearly matching the corresponding centralized VB algorithm that relies on all data being available at a fusion center [5]. Also considered is the robust transmission of sensed data across a vast field of small, vulnerable sensors toward sink nodes.…”
Section: Introduction
confidence: 96%
“…Some distributed variational inference (DVI) algorithms have been proposed to estimate the posterior over networks [18]–[20]. DVI with a Gaussian mixture model (DVI-GMM) [18], [20] approximates the multi-modal posterior by distributed optimization methods. Although DVI-GMM can successfully capture the multi-modal posterior over networks, it still faces challenges in both computational complexity and theory.…”
Section: Introduction
confidence: 99%
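The DVI-GMM excerpt above centers on approximating a multi-modal posterior with a Gaussian mixture. As a hedged illustration (not the method of [18] or [20]), the local per-node computation in such a scheme would include a standard mixture responsibility step, which each node could run on its own data before exchanging sufficient statistics with neighbors:

```python
import math

# Illustrative responsibility (E-step-style) computation for a 1-D
# Gaussian mixture with fixed parameters. Names and the two-component
# setup are hypothetical, chosen only to show the local computation.
def responsibilities(x, weights, means, variances):
    out = []
    for v in x:
        dens = [w * math.exp(-(v - m) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)
                for w, m, s in zip(weights, means, variances)]
        z = sum(dens)                      # normalizing constant per point
        out.append([d / z for d in dens])
    return out

# Two well-separated components; each point is assigned almost entirely
# to its nearby component.
r = responsibilities([0.1, 4.9], [0.5, 0.5], [0.0, 5.0], [1.0, 1.0])
```

Each row of `r` sums to one; in a distributed setting, the responsibility-weighted sums over local data would form the statistics that the network averages.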