Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence 2021
DOI: 10.24963/ijcai.2021/587

Partition Function Estimation: A Quantitative Study

Abstract: Probabilistic graphical models have emerged as a powerful modeling tool for several real-world scenarios where one needs to reason under uncertainty. A graphical model's partition function is a central quantity of interest, and its computation is key to several probabilistic reasoning tasks. Given the #P-hardness of computing the partition function, several techniques have been proposed over the years with varying guarantees on the quality of estimates and their runtime behavior. This paper seeks to present a …
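In standard notation (not quoted from the paper's abstract), for a graphical model with factors \psi_\alpha over variable subsets x_\alpha the partition function is

Z = \sum_{x} \prod_{\alpha} \psi_\alpha(x_\alpha),

so exact computation sums over exponentially many joint assignments, which is the source of the #P-hardness noted above.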

Cited by 5 publications (9 citation statements); References 13 publications (22 reference statements)
“…Also, they give an estimate of all the queries of interest. These results agree with the results for PR reported in a recent evaluation of various approximate methods (Agrawal, Pote, & Meel, 2021). As a consistency check, we used both these methods for comparison.…”
Section: Methods Used For Comparison (supporting)
confidence: 88%
“…Both libDAI and Merlin are robust and have been used for comparison with other methods (Agrawal et al, 2021;Lin et al, 2020).…”
Section: Methods Used For Comparison (mentioning)
confidence: 99%
“…To evaluate our PMC-based tool for performing probabilistic inference on BNs, we took the state-of-the-art BN inference tool Ace as baseline. Ace is developed by Darwiche's group that is the state-of-the-art for probabilistic inference on Bayesian networks; see the recent study (Agrawal, Pote, & Meel, 2021). It takes a BN as input and compiles it into an arithmetic circuit (AC).…”
Section: Probabilistic Inference (mentioning)
confidence: 99%
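The pipeline described in this citation (a Bayesian network compiled into an arithmetic circuit) can be illustrated with a minimal sketch. The node encoding and function below are hypothetical, not Ace's actual format or API; they only show how a compiled AC is evaluated bottom-up to obtain the partition function or the probability of evidence.

```python
# Minimal sketch: bottom-up evaluation of an arithmetic circuit (AC).
# Node encoding (hypothetical): leaves are ("ind", label) indicator nodes or
# ("param", value) parameter nodes; internal nodes are ("+", None, children)
# sum nodes or ("*", None, children) product nodes.

def eval_ac(node, indicators):
    """Evaluate an AC node given a dict mapping indicator labels to 0/1 values."""
    kind = node[0]
    if kind == "ind":                      # indicator leaf, defaults to 1.0
        return indicators.get(node[1], 1.0)
    if kind == "param":                    # network parameter leaf
        return node[1]
    values = [eval_ac(child, indicators) for child in node[2]]
    if kind == "+":                        # sum node
        return sum(values)
    if kind == "*":                        # product node
        result = 1.0
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown node kind: {kind}")

# Tiny circuit for one binary variable A with P(A=1) = 0.3:
# Z = ind(A=0)*0.7 + ind(A=1)*0.3
ac = ("+", None, [
    ("*", None, [("ind", "A=0"), ("param", 0.7)]),
    ("*", None, [("ind", "A=1"), ("param", 0.3)]),
])

print(eval_ac(ac, {}))            # 1.0  (no evidence: partition function)
print(eval_ac(ac, {"A=0": 0.0}))  # 0.3  (evidence A=1)
```

Setting an indicator to 0 corresponds to asserting evidence against that value; with all indicators left at 1 the circuit returns the partition function.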
“…The Boolean formula is then compiled into a tractable Boolean circuit (deterministic, decomposable and smooth), from which a tractable arithmetic circuit known as an AC is finally extracted. This approach is implemented by the ACE system, which was recently evaluated in [1] and shown to exhibit state-of-the-art performance; see also [52].…”
Section: Smoothness (mentioning)
confidence: 99%
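For context on the reduction this citation refers to (standard background, not a quote from the cited work): once a network is encoded as a Boolean formula \Delta with literal weights W, the partition function equals a weighted model count,

Z = \mathrm{WMC}(\Delta, W) = \sum_{\omega \models \Delta} \prod_{\ell \in \omega} W(\ell),

and determinism, decomposability and smoothness are exactly the circuit properties that allow this sum to be evaluated in time linear in the size of the compiled circuit.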
“…These properties of tractable circuits made them very suitable for integration with modern pipelines for machine learning and neuro-symbolic AI; see, for example, [51,112,111,70,22,72,92,54,59,53] where tractable circuits have been recently employed in and/or integrated with neural networks, deep reinforcement learning, Bayesian network classifiers and (deep) probabilistic logic programs. Even the traditional offline/online divide that originally motivated knowledge compilation for reasoning is now being exploited in modern settings as it is aligned with the training/inference divide that governs modern AI systems; see, e.g., [51]. As such, a recent trend has emerged in which tools and techniques that were initially envisioned for reasoning tasks are now being employed to facilitate learning and its integration with knowledge and reasoning.…”
mentioning
confidence: 99%