Published: 2017
DOI: 10.1017/s1471068417000333

A new algorithm to automate inductive learning of default theories

Abstract: In inductive learning of a broad concept, an algorithm should be able to distinguish concept examples from exceptions and noisy data. An approach through recursively finding patterns in exceptions turns out to correspond to the problem of learning default theories. Default logic is what humans employ in commonsense reasoning. Therefore, learned default theories are better understood by humans. In this paper, we present new algorithms to learn default theories in the form of non-monotonic logic programs. Experi…
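
The non-monotonic logic programs mentioned in the abstract encode a default rule together with its exceptions via negation-as-failure. As a minimal sketch (an assumed illustration in answer-set-programming syntax, not code from the paper), a learned default theory for the classic "birds fly" concept could look like:

    % Default rule: a bird flies unless it is shown to be abnormal (negation-as-failure).
    flies(X) :- bird(X), not ab_bird(X).
    % Exception: penguins are abnormal with respect to flying, so the default is blocked.
    ab_bird(X) :- penguin(X).
    % Background facts.
    bird(X) :- penguin(X).
    bird(tweety).
    penguin(polly).

Under this program, flies(tweety) holds while flies(polly) does not; the algorithms in the paper learn rule-plus-exception structures of this general shape (a default rule guarded by an abnormality predicate) from positive and negative examples.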

Cited by 12 publications (17 citation statements). References 17 publications.
“…The FOLD algorithm [19,20] Example 1 In the FOLD-R++ algorithm, the target is to learn rules for fly(X).…”
Section: The FOLD-R++ Algorithm
Mentioning; confidence: 99%
“…The FOLD-R algorithm [19,20] selects the best literal based on the weighted information gain for learning defaults, similar to the original FOLD algorithm described in [20]. For numeric features, the FOLD-R algorithm would enumerate all the possible splits.…”
Section: Literal Selection
Mentioning; confidence: 99%
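
For background on the literal selection mentioned above: the FOLD family performs FOIL-style greedy search, and a standard form of the information gain heuristic this weighting builds on (stated here as general background, not quoted from the cited works) is

    IG(L, R) = t \left( \log_2 \frac{p_1}{p_1 + n_1} - \log_2 \frac{p_0}{p_0 + n_0} \right)

where p_0, n_0 (resp. p_1, n_1) count the positive and negative examples covered by clause R before (resp. after) adding candidate literal L, and t is the number of positive examples covered both before and after the addition. For a numeric feature, each threshold between consecutive observed values yields one candidate literal scored by the same measure.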
“…There are a number of directions for future work: (i) Equation 7 can measure dissimilarity too. Dissimilarity is usually expressed as negation-as-failure in ILP (Shakerin et al. 2017). We plan to incorporate dissimilarity to support vectors of the opposite class as a way to reduce the number of induced clauses as well as increase the accuracy of each clause.…”
Section: Related Work
Mentioning; confidence: 99%
“…In such a case, we will be learning answer set programs (Gelfond and Kahl 2014). Learning answer set programs has an added advantage of being able to distinguish between exceptions and noise (Shakerin et al. 2017; Shakerin and Gupta 2020). (ii) In sparse regions of training data, specialization is usually stopped too early.…”
Section: Related Work
Mentioning; confidence: 99%
“…In this section we present the SHAP-FOLD algorithm. SHAP-FOLD learns a concept in terms of a default theory [15]. A default theory is a non-monotonic logic theory that formalizes reasoning with default assumptions in the absence of complete information.…”
Section: SHAP-FOLD Algorithm
Mentioning; confidence: 99%