2003
DOI: 10.1002/int.10085

An iterated local search algorithm for learning Bayesian networks with restarts based on conditional independence tests

Abstract: A common approach for learning Bayesian networks (BNs) from data is based on the use of a scoring metric to evaluate the fitness of any given candidate network to the data and a method to explore the search space, which usually is the set of directed acyclic graphs (DAGs). The most efficient search methods used in this context are greedy hill-climbing methods, either deterministic or stochastic. One of these methods that has been applied with some success is hill climbing with random restart. In this article we study …
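The abstract describes the standard score-plus-search framing: a scoring metric evaluates candidate DAGs, and a greedy hill climber, restarted from random networks, explores the space. The sketch below is only a minimal illustration of that generic framing, not the paper's ILS algorithm (which guides its restarts with conditional independence tests); the move set (single-arc addition, deletion, reversal), the integer node indexing, and the toy scoring function are assumptions made for the example.

```python
import itertools

def is_acyclic(edges, n):
    """Kahn's algorithm: True iff the directed graph on nodes 0..n-1 has no cycle."""
    indeg = [0] * n
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    stack = [v for v in range(n) if indeg[v] == 0]
    visited = 0
    while stack:
        u = stack.pop()
        visited += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return visited == n

def neighbors(edges, n):
    """DAGs reachable from `edges` by one arc addition, deletion, or reversal."""
    for u, v in itertools.permutations(range(n), 2):
        if (u, v) in edges:
            yield edges - {(u, v)}                       # delete u -> v (stays acyclic)
            reversed_ = (edges - {(u, v)}) | {(v, u)}
            if is_acyclic(reversed_, n):
                yield reversed_                          # reverse u -> v
        else:
            added = edges | {(u, v)}
            if is_acyclic(added, n):
                yield added                              # add u -> v

def hill_climb(score, n, start=frozenset()):
    """Greedy ascent: apply the best single-arc move until no move improves the score."""
    current = frozenset(start)
    current_score = score(current)
    while True:
        best_move, best_score = None, current_score
        for cand in neighbors(current, n):
            s = score(cand)
            if s > best_score:
                best_move, best_score = frozenset(cand), s
        if best_move is None:
            return current, current_score
        current, current_score = best_move, best_score

# Toy usage: a placeholder score that rewards agreement with a made-up target
# structure; a real implementation would use a metric such as BDe or BIC.
target = {(0, 1), (1, 2)}
best, best_score = hill_climb(lambda e: -len(set(e) ^ target), n=3)
print(sorted(best), best_score)   # expected: [(0, 1), (1, 2)] 0
```

In a random-restart variant, this greedy loop would simply be re-run from several randomly drawn initial DAGs and the best-scoring local optimum kept.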

Cited by 32 publications (31 citation statements)
References 14 publications

“…ILS is used as a local search procedure within a GRASP approach by Ribeiro and Urrutia for tackling the mirrored traveling tournament problem [74]. Very high performing ILS algorithms have also been proposed for problems such as maximum clique [50], image registration [24], some loop layout problems [10], linear ordering [19,78], logistic network design problems [22], a capacitated hub location problem [75], Bayesian networks structure learning [25], and minimum sum-of-squares clustering [64].…”
Section: Max-SAT (mentioning)
confidence: 99%
“…Results suggest that a number of BBN models performed exceptionally well as classifiers for the 'Victimization' attribute. In fact, all listed versions of the local hill climbers and local K2 search algorithms (de Campos et al. 2003; Madden 2003) led to classification performances with 97% or better accuracy. It is noted that the build times for these algorithms are on the same order and therefore, build time was not considered as a further significant factor in performance comparisons.…”
Section: Development of Bayesian Belief Network Model for NCVS (mentioning)
confidence: 97%
“…One of them is a greedy hill-climbing search with random restarts [27]. A random network model is selected for each restart.…”
confidence: 99%
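The restart scheme quoted above seeds each restart with a randomly selected network. One simple way to draw such a random DAG is to fix a random node ordering and include each forward arc with some probability; this construction is an illustrative assumption for the sketch below, not the procedure used in [27] or in the cited article.

```python
import random

def random_dag(n_nodes, p=0.3, rng=None):
    """Sample a DAG by orienting every included arc along a random node ordering."""
    rng = rng or random.Random()
    order = list(range(n_nodes))
    rng.shuffle(order)                 # a random topological order guarantees acyclicity
    edges = set()
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if rng.random() < p:
                edges.add((order[i], order[j]))   # arc from earlier to later node in the order
    return edges

# One independent random starting structure per restart:
restart_starts = [random_dag(5, rng=random.Random(seed)) for seed in range(10)]
```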