2007 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2007.4557602
On Variational Message Passing on Factor Graphs

Abstract: In this paper, it is shown how (naive and structured) variational algorithms may be derived from a factor graph by mechanically applying generic message computation rules; in this way, one can bypass error-prone variational calculus. In prior work by Bishop et al., Xing et al., and Geiger, directed and undirected graphical models have been used for this purpose. The factor graph notation amounts to simpler generic variational message computation rules; by means of factor graphs, variational methods can straigh…

Cited by 156 publications (173 citation statements)
References 11 publications
“…This paper follows the variational message passing (VMP) formulation from [15], which defines a recipe for iterative message updates, such that the variational approximation is guaranteed to converge to a local minimum of the KL divergence. Fig.…”
Section: Inference by Variational Message Passing
confidence: 99%
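The quoted statement notes that VMP iterates message updates so the variational approximation converges to a local minimum of the KL divergence. The simplest instance is naive mean-field coordinate ascent; the following sketch (a generic illustration, not code from the cited papers) runs the closed-form factorized-mean updates for a bivariate Gaussian target, where each update plays the role of one variational message:

```python
import numpy as np

# Target: bivariate Gaussian p(x) with mean mu and precision matrix L.
mu = np.array([1.0, -1.0])
L = np.array([[2.0, 0.8],
              [0.8, 1.5]])

# Naive mean-field q(x1)q(x2): iterate the closed-form coordinate updates.
# Each update corresponds to one variational message on the factor graph.
m = np.zeros(2)  # initial factorized means
for _ in range(50):
    m[0] = mu[0] - (L[0, 1] / L[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (L[1, 0] / L[1, 1]) * (m[0] - mu[0])

# For a Gaussian target, mean-field recovers the true mean exactly.
print(np.allclose(m, mu))  # True
```

Each sweep is a coordinate-descent step on the KL divergence, which is why the iteration cannot increase it and converges to a local minimum, as the excerpt states.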
“…3. Variational message updates are computed in accordance with [15], with shorthand notations for the mean ⟨f(·)⟩ = E_{q(·)}[f(·)], covariance Cov[f(·)] = Cov_{q(·)}[f(·)], and average energy U[f(·)] = −E_{q(·)}[log f(·)].¹ Table I summarizes the variational message updates for the Bernoulli node, with switch z ∈ {0, 1} and parametrized by π ∈ [0, 1] through the node function f(z, π) = π^z (1 − π)^{1−z}.…”
¹ Derivations are available at http://biaslab.github.io/pdf/slf/supplement.pdf
Section: Appendix
confidence: 99%
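The excerpt defines the average energy U[f(·)] = −E_{q(·)}[log f(·)] for the Bernoulli node f(z, π) = π^z(1 − π)^{1−z}. A minimal stdlib-only sketch of that quantity under an assumed factorized posterior q(z) = Bernoulli(ρ), q(π) = Beta(a, b) (the posterior family and helper names are illustrative, not taken from the cited work; digamma is approximated by a finite difference of lgamma):

```python
from math import lgamma

def digamma(x: float, h: float = 1e-6) -> float:
    # Central-difference approximation of the digamma function
    # (stdlib-only stand-in for scipy.special.digamma).
    return (lgamma(x + h) - lgamma(x - h)) / (2.0 * h)

def bernoulli_average_energy(rho: float, a: float, b: float) -> float:
    """Average energy U[f] = -E_q[log f(z, pi)] of the Bernoulli node
    f(z, pi) = pi^z (1 - pi)^(1 - z), under the assumed factorized
    posterior q(z) = Bernoulli(rho), q(pi) = Beta(a, b)."""
    e_log_pi = digamma(a) - digamma(a + b)      # E_q[log pi]
    e_log_1mpi = digamma(b) - digamma(a + b)    # E_q[log(1 - pi)]
    return -(rho * e_log_pi + (1.0 - rho) * e_log_1mpi)

print(round(bernoulli_average_energy(0.5, 1.0, 1.0), 4))  # 1.0
```

The closed form follows from log f(z, π) = z log π + (1 − z) log(1 − π) and the standard Beta expectations E[log π] = ψ(a) − ψ(a + b), E[log(1 − π)] = ψ(b) − ψ(a + b).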
“…The position posterior pdf p(x_r | D) of any mobile sensor r ∈ V_M can now be estimated via message passing methods [7]–[14]. Two common message passing methods adapted to graphs as depicted in Figure 2 are displayed in Figure 3: the SP algorithm, which implements BP [9], and VMP, which implements the variational Bayesian method [12].…”
Section: A. Message Passing on Factor Graphs
confidence: 99%
“…For continuous hidden variables, evaluation of (6), (8) and (9) (see Figure 3a) can become arbitrarily complex [3], [7], [12], [13]. A way to control this is to restrict the messages passed between the nodes to be Gaussian [7].…”
Section: B. The Sum-Product Algorithm
confidence: 99%
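The last excerpt observes that sum-product messages for continuous hidden variables can become arbitrarily complex, and that restricting all messages to be Gaussian keeps them tractable [7]. A minimal sketch of why: for a linear node y = x + n and an equality (observation-fusion) node, the Gaussian-restricted message rules are closed-form in two parameters each. The type and function names below are hypothetical illustrations, not from the cited papers:

```python
from dataclasses import dataclass

@dataclass
class GaussianMsg:
    mean: float
    var: float

def through_add_node(mx: GaussianMsg, mn: GaussianMsg) -> GaussianMsg:
    """Sum-product message through y = x + n: means and variances add."""
    return GaussianMsg(mx.mean + mn.mean, mx.var + mn.var)

def combine(m1: GaussianMsg, m2: GaussianMsg) -> GaussianMsg:
    """Pointwise product of two Gaussian messages (equality node):
    precisions add, and precision-weighted means add."""
    w1, w2 = 1.0 / m1.var, 1.0 / m2.var
    var = 1.0 / (w1 + w2)
    return GaussianMsg(var * (w1 * m1.mean + w2 * m2.mean), var)

prior_x = GaussianMsg(0.0, 1.0)   # upstream message on x
noise_n = GaussianMsg(0.0, 0.5)   # message from the noise prior
fwd_y = through_add_node(prior_x, noise_n)        # N(0, 1.5)
post_y = combine(fwd_y, GaussianMsg(2.0, 0.1))    # fuse with an observation
print(fwd_y.var, round(post_y.mean, 4), round(post_y.var, 5))  # 1.5 1.875 0.09375
```

Because every message stays in the two-parameter Gaussian family, the integrals in the generic sum-product rule never need to be evaluated numerically, which is the tractability point the excerpt makes.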