2003
DOI: 10.1007/978-3-540-45062-7_49

The Hugin Tool for Learning Bayesian Networks

Cited by 68 publications (32 citation statements)
References 8 publications
“…For instance, the necessary path condition (NPC) algorithm, which is an enhancement of the PC algorithm, allows the user to interactively decide on the directionality of the undirected links. The disadvantage of both PC and NPC algorithms is that their outcomes are structures and not conditional probability matrices (Madsen et al 2004). Friedman et al (1999) suggest an algorithm that limits the search space for learning a BN by restricting the parents of each variable to belong to a small subset of candidates.…”
Section: Bayesian Network
confidence: 99%
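The candidate-restriction idea attributed to Friedman et al. (1999) in the excerpt above can be illustrated with a minimal sketch. This is not the actual Sparse Candidate algorithm, only a hypothetical illustration: each variable's potential parents are restricted to the k other variables with the highest empirical mutual information, shrinking the structure-learning search space.

```python
import math

def mutual_information(data, x, y):
    """Empirical mutual information (in nats) between discrete
    columns x and y of data, given as a list of dicts."""
    n = len(data)
    px, py, pxy = {}, {}, {}
    for row in data:
        px[row[x]] = px.get(row[x], 0) + 1
        py[row[y]] = py.get(row[y], 0) + 1
        pxy[(row[x], row[y])] = pxy.get((row[x], row[y]), 0) + 1
    mi = 0.0
    for (a, b), c in pxy.items():
        p = c / n
        mi += p * math.log(p / ((px[a] / n) * (py[b] / n)))
    return mi

def candidate_parents(data, variables, k=2):
    """Restrict the candidate parents of each variable to the k
    most informative other variables (a crude stand-in for the
    candidate-selection step of Friedman et al.)."""
    cands = {}
    for v in variables:
        scored = sorted((mutual_information(data, v, u), u)
                        for u in variables if u != v)
        cands[v] = [u for _, u in scored[-k:]]
    return cands
```

With perfectly correlated columns A and B and an independent column C, `candidate_parents(data, ['A', 'B', 'C'], k=1)` selects B as the sole candidate parent of A and vice versa.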
“…To improve performance, fractional updating is only performed for π_ij when P(π_ij) > 0 (an update would leave θ*_ijk and α_ij unchanged when P(π_ij) = 0), see [10]. Both fractional updating and Online EM perform no update when α_ij = 0.…”
Section: Parameter Learning Algorithms
confidence: 99%
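The skip condition in the excerpt above can be sketched as follows. This is a minimal, hypothetical illustration of one fractional-updating step for a single node, not the HUGIN implementation: Dirichlet counts α_jk are incremented by the expected joint probabilities P(X = k, π_j | evidence), and configurations with P(π_j) = 0 are skipped because adding zero would leave both the counts and the parameter estimates unchanged.

```python
def fractional_update(alpha, p_joint):
    """One fractional-updating step for a single node.

    alpha[j][k]   -- Dirichlet counts for parent configuration j, state k
    p_joint[j][k] -- P(X = k, pi_j | evidence), assumed computed by inference

    Returns the updated counts and the CPT estimates theta[j][k].
    """
    theta = []
    for j, row in enumerate(alpha):
        p_parent = sum(p_joint[j])        # P(pi_j | evidence)
        if p_parent > 0:                  # skip: the update is a no-op otherwise
            for k in range(len(row)):
                row[k] += p_joint[j][k]   # add expected counts
        total = sum(row)
        theta.append([a / total if total > 0 else 0.0 for a in row])
    return alpha, theta
```

For a parent configuration with P(π_j | evidence) = 0, the row of counts is left untouched, matching the optimization described in the excerpt.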
“…A similar gradient descent approach is described in [5]. [10] describes how the approach of [16], referred to as sequential learning, has been implemented in the HUGIN tool. The online EM algorithm [2] is a stochastic gradient method that is faster than other gradient methods such as [19], which involves the difficult task of determining the step size between iterations.…”
Section: Introduction
confidence: 99%
“…The algorithms were implemented using HUGIN software version 8.0 [17,18] and MPI. The HUGIN software does not have any special features necessary for implementing the ideas presented in this paper.…”
Section: Fyrkat
confidence: 99%