2016
DOI: 10.1007/s10955-016-1566-0
Cycle-Based Cluster Variational Method for Direct and Inverse Inference

Abstract: We elaborate on the idea that loop corrections to belief propagation can be dealt with in a systematic way on pairwise Markov random fields, by using the elements of a cycle basis to define regions in a generalized belief propagation setting. The region graph is specified in such a way as to avoid dual loops as much as possible, by discarding redundant Lagrange multipliers, in order to facilitate convergence while avoiding the instabilities associated with minimal factor graph construction. We end up with a tw…
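As an illustration of the central construction in the abstract — using a cycle basis of the pairwise MRF graph to seed the large regions of a region graph — the following sketch computes a fundamental cycle basis via a spanning tree. This is not code from the paper; the graph, the region choices, and the helper `cycle_basis` are hypothetical examples for a 5-variable MRF with 6 pairwise factors.

```python
# Illustrative sketch (not from the paper): fundamental cycle basis of a
# pairwise-MRF graph via a BFS spanning tree. Each non-tree edge, closed
# through the tree, yields one basis cycle; in a cycle-based CVM each such
# cycle would define one large region, with edges and nodes as sub-regions.
from collections import deque

def cycle_basis(n, edges):
    """Fundamental cycle basis of a connected graph on nodes 0..n-1."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # BFS spanning tree rooted at node 0.
    parent = {0: None}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in parent:
                parent[w] = u
                queue.append(w)
    tree = {frozenset((v, p)) for v, p in parent.items() if p is not None}
    basis = []
    for u, v in edges:
        if frozenset((u, v)) in tree:
            continue  # tree edges close no cycle
        # Walk u up to the root, then walk v up until the paths meet.
        path_u = []
        x = u
        while x is not None:
            path_u.append(x)
            x = parent[x]
        path_v = []
        x = v
        while x not in path_u:
            path_v.append(x)
            x = parent[x]
        basis.append(path_u[: path_u.index(x) + 1] + list(reversed(path_v)))
    return basis

# 5 variables, 6 pairwise factors: two independent loops (|E| - |V| + 1 = 2).
cycles = cycle_basis(5, [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 2)])
print(len(cycles))  # → 2
```

Each cycle returned here would become one region; the dual-loop avoidance described in the abstract then concerns which Lagrange multipliers tie these regions to their shared edges and nodes.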

Cited by 2 publications (6 citation statements); references 43 publications.
“…To do so we presented a new way of introducing the messages in the CVM that differs from standard parent-to-child messages in that messages are sent to a region from all its ancestors, and not only by its direct parents. While previous attempts to fix the gauge invariance in GBP equations [18,19,21] relied on the idea of removing some selected messages from the equations, our approach increases the number of such messages, but with a restriction on their degrees of freedom.…”
Section: Discussion
confidence: 99%
“…In the end, it all amounts to discovering which messages are redundant and removing them from the representation, or setting them to an arbitrary value, fixing the gauge [18,19,21]. In many cases, although the final objective is clear (destroying the loops), there are many different ways to achieve it, and each author has selected their own way.…”
Section: A Moment Matching Is Gauge Free
confidence: 99%
“…We considered two types of models involving either latent binary variables [21] or Gaussian copula models. Both come with an associated learning algorithm to generate models adapted respectively to Generalized Belief Propagation for binary variables [22] and Gaussian Belief Propagation for real-valued variables [23].…”
Section: Our Approach
confidence: 99%