2005
DOI: 10.1088/1742-5468/2005/11/p11015

Approximate inference techniques with expectation constraints

Abstract: This article discusses inference problems in probabilistic graphical models that often occur in a machine learning setting. In particular, it presents a unified view of several recently proposed approximation schemes. Expectation consistent approximations and expectation propagation are both shown to be related to Bethe free energies with weak consistency constraints, i.e. free energies in which local approximations are only required to agree on certain statistics rather than on full marginals.
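As a rough illustration of the "weak consistency" idea (the notation below is illustrative, not copied from the paper): instead of requiring overlapping local beliefs to agree on full marginals, the free energy only requires them to agree on the expectations of a chosen set of statistics g.

```latex
% Illustrative notation (an assumption, not the paper's): q_a(x_a) are local
% cluster beliefs, q_i(x_i) are single-variable beliefs, and g(x_i) is a
% chosen vector of statistics (e.g. first and second moments).
% Strong consistency, as in the standard Bethe free energy:
\sum_{x_a \setminus x_i} q_a(x_a) = q_i(x_i) \qquad \text{for all } i \in a
% Weak (expectation) consistency, as in EC / EP style free energies:
\mathbb{E}_{q_a}\big[ g(x_i) \big] = \mathbb{E}_{q_i}\big[ g(x_i) \big] \qquad \text{for all } i \in a
```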

Cited by 42 publications (42 citation statements, published 2007–2019); references 19 publications.

Citation statements, ordered by relevance:
“…Of course the choice of decomposition should be guided not only by tractability but also by the quality of the approximation. We expect from central limit theorem (CLT) arguments that the EC approximation with this decomposition will become better the higher the number of sources with a "homogeneous" connectivity of the mixing matrix [15,16,5]. Empirically we observe that the EC approximation is almost always more precise than the variational approximation, even for quite small systems where we cannot really rely on the CLT argument.…”
Section: Expectation Consistent (mentioning)
confidence: 92%
“…EC and EP are closely related to the adaptive TAP framework [15,16,6,5]. In fact these non-linear iterative methods share fixed points.…”
Section: Introduction (mentioning)
confidence: 99%
“…EP projects the a posteriori estimate onto a Gaussian distribution by moment matching, and thus obtains a message update rule similar to that of Gaussian message passing (GMP) [14]-[18]. The potential connection between AMP and EP was first shown in [19], [20], in which the fixed points of EP and AMP were shown to be consistent. An EP-based AMP was proposed in [21].…”
Section: Introduction (mentioning)
confidence: 99%
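The excerpt above describes EP's core operation: project a tilted distribution (cavity Gaussian times a non-Gaussian site factor) onto a Gaussian by matching its first two moments, then recover the site message by dividing out the cavity. A minimal Python sketch of a single scalar site update, under purely illustrative assumptions (the function names, the simple quadrature grid, and the sign-observation example are hypothetical, not taken from the cited papers):

```python
import numpy as np

def gaussian_ep_site_update(cavity_mean, cavity_var, log_factor,
                            grid_half_width=10.0, n_grid=2001):
    """One scalar Gaussian-EP step (illustrative sketch): project the tilted
    distribution onto a Gaussian by moment matching, then recover the site
    message by dividing out the cavity in natural (precision) parameters."""
    std = np.sqrt(cavity_var)
    x = cavity_mean + std * np.linspace(-grid_half_width, grid_half_width, n_grid)

    # Tilted (unnormalised) density: cavity Gaussian times the site factor.
    log_tilted = -0.5 * (x - cavity_mean) ** 2 / cavity_var + log_factor(x)
    w = np.exp(log_tilted - log_tilted.max())
    w /= w.sum()

    # Moment matching: mean and variance of the tilted distribution.
    tilted_mean = np.sum(w * x)
    tilted_var = np.sum(w * (x - tilted_mean) ** 2)

    # Message = tilted Gaussian / cavity Gaussian, in natural parameters.
    msg_prec = 1.0 / tilted_var - 1.0 / cavity_var        # can be negative for some factors
    msg_prec_mean = tilted_mean / tilted_var - cavity_mean / cavity_var
    return msg_prec, msg_prec_mean

# Hypothetical usage: a hard sign factor observing x > 0.
prec, prec_mean = gaussian_ep_site_update(
    0.3, 2.0, lambda x: np.where(x > 0, 0.0, -np.inf))
print(prec, prec_mean)
```

The subtraction of natural parameters in the last step is exactly where EP can produce improper (negative-precision) messages, which the next excerpt touches on.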
“…This approach turns out to be equivalent to that in [6] when applied to turbo-equalization [5]. In [7] a combined use of Gaussian expectation propagation (EP) [8], [9] and BP is proposed. The use of EP, however, can lead to an unstable algorithm because computed Gaussian EP messages may have a negative variance.…”
Section: Introduction (mentioning)
confidence: 99%
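As that last excerpt notes, the Gaussian division in EP can yield messages with negative variance. A common practical safeguard, offered here as a heuristic sketch rather than anything prescribed by the cited works, is to damp the site update in natural parameters and skip updates that would leave the site precision negative, continuing the hypothetical scalar example above:

```python
def damped_site_update(old_prec, old_prec_mean, new_prec, new_prec_mean,
                       damping=0.5, min_prec=1e-8):
    """Heuristic safeguard for Gaussian EP (illustrative, not from the cited
    papers): damp the site update in natural parameters and refuse steps that
    would leave the site precision negative or zero."""
    prec = (1.0 - damping) * old_prec + damping * new_prec
    prec_mean = (1.0 - damping) * old_prec_mean + damping * new_prec_mean
    if prec < min_prec:                 # would imply a negative (or infinite) variance
        return old_prec, old_prec_mean  # skip this update entirely
    return prec, prec_mean
```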