2017
DOI: 10.1016/j.amc.2016.12.025

Neural network implementation of inference on binary Markov random fields with probability coding

Cited by 7 publications (12 citation statements)
References 15 publications
“…Similar to the studies in [26,27], we assume that J_ij(x_i, x_j) = J_ij x_i x_j and observations. In this paper, we only consider marginal inference.…”
Section: Markov Random Fields and Marginal Inference
confidence: 99%
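The Ising-form assumption J_ij(x_i, x_j) = J_ij x_i x_j quoted above can be made concrete with a minimal sketch of exact marginal inference by brute-force enumeration; the couplings J and biases b below are illustrative values, not taken from the paper:

```python
import itertools
import numpy as np

# Toy binary MRF with pairwise energy J_ij * x_i * x_j (Ising form),
# variables x_i in {-1, +1}. Couplings and biases are illustrative.
J = {(0, 1): 0.8, (1, 2): -0.5, (0, 2): 0.3}
b = np.array([0.2, 0.0, -0.1])  # unary (observation) terms
n = 3

def unnorm_prob(x):
    # exp of the negative energy: pairwise terms plus unary terms
    e = sum(Jij * x[i] * x[j] for (i, j), Jij in J.items()) + b @ x
    return np.exp(e)

states = [np.array(s) for s in itertools.product([-1, 1], repeat=n)]
Z = sum(unnorm_prob(s) for s in states)  # partition function

# Marginal P(x_0 = 1): sum unnormalized mass over joint states with x_0 = 1.
p0 = sum(unnorm_prob(s) for s in states if s[0] == 1) / Z
print(p0)
```

Enumeration is exponential in n, which is exactly why the cited works resort to approximate (mean-field) marginal inference for larger models.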
“…Similar to the studies in [26,27], we only consider inference of binary MRFs in this paper, which means the value of the variable x_i can be 1 or −1 (x_i = 1 or −1).…”
Section: Converting Mean-field Inference into a Differential Equation
confidence: 99%
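For a binary MRF of this kind, mean-field marginal inference can be sketched as the damped fixed-point iteration m_i ← tanh(b_i + Σ_j J_ij m_j), which is an Euler discretization of the continuous-time dynamics dm/dt = −m + tanh(b + Jm) that such differential-equation formulations study; the coupling matrix below is illustrative, not taken from the paper:

```python
import numpy as np

# Naive mean-field inference for a binary (Ising) MRF: iterate
#   m_i <- tanh(b_i + sum_j J_ij * m_j)
# with damping until convergence; m_i approximates E[x_i], x_i in {-1, +1}.
# Each damped step is one Euler step of dm/dt = -m + tanh(b + J m).
# Couplings and biases are illustrative values.
J = np.array([[0.0, 0.8, 0.3],
              [0.8, 0.0, -0.5],
              [0.3, -0.5, 0.0]])
b = np.array([0.2, 0.0, -0.1])

def mean_field(J, b, iters=500, damping=0.5):
    m = np.zeros(len(b))
    for _ in range(iters):
        m = (1 - damping) * m + damping * np.tanh(b + J @ m)
    return m

m = mean_field(J, b)
print(m)  # approximate marginal means E[x_i]
```

The approximate marginal probabilities then follow as P(x_i = 1) ≈ (1 + m_i) / 2.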
“…Beck and Pouget [7] went a further step to solve the approximation problem and came up with a precise equivalence relation. Similarly, Ott et al [8] and Yu et al [9] built the relationship between the inference equations of Markov random fields and the dynamics of recurrent neural networks with BP. The above works, based on equivalence proofs, are only appropriate for small-scale Bayesian inference.…”
Section: Introduction
confidence: 99%
“…In this way, the summation of probabilities can be calculated by summing over the responses of the neurons. The same coding scheme was also used in [27]. In order to simplify the multiplication of probabilities, Rao [28], [29] proposed to use a log-probability code and proved that the differential equations of recurrent neural networks coincide with the inference equations of the hidden Markov model, in which the computation of sum-logs was used to approximate the computation of log-sum.…”
confidence: 99%
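The sum-logs versus log-sum distinction can be illustrated numerically: under a log-probability code, neurons carry l_i = log p_i, and a linear network naturally accumulates Σ_i l_i (the log of a product), whereas exact marginalization requires log Σ_i exp(l_i). A minimal sketch, with illustrative probabilities:

```python
import numpy as np

# Log-probability coding: neurons carry l_i = log p_i. A linear recurrent
# network naturally computes sums of logs (sum l_i = log of a product),
# while exact marginalization needs the log of a sum, log(sum_i exp(l_i)).
# The probabilities below are illustrative, not from the cited papers.
p = np.array([0.6, 0.3, 0.1])
l = np.log(p)

sum_logs = l.sum()                 # log(prod p_i): what a linear network computes
log_sum = np.log(np.exp(l).sum())  # log(sum p_i): what exact inference needs

# Numerically stable evaluation of the same log-sum via the max-shift identity.
m = l.max()
log_sum_stable = m + np.log(np.exp(l - m).sum())

print(sum_logs, log_sum, log_sum_stable)
```

Since the p_i here sum to one, the exact log-sum is (numerically) zero, while the sum of logs is a distinctly different quantity; the gap is what the sum-logs approximation in [28], [29] has to control.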