1990
DOI: 10.1007/3-540-52062-7_82

Techniques for efficient empirical induction

Abstract: This paper describes the LEI algorithm for empirical induction. The LEI algorithm provides efficient empirical induction for discrete attribute value data. It derives a classification procedure in the form of a set of predicate logic classification rules. This contrasts with the only other efficient approach to exhaustive empirical induction, the derivatives of the CLS algorithm, which present their classification procedures in the form of a decision tree. The LEI algorithm will always find the simplest nondis…
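As a rough illustration of the contrast drawn in the abstract, the sketch below shows a toy classifier expressed as a set of attribute-value rules rather than a decision tree. The attributes, rule conditions, and classes are invented for illustration only and are not taken from the paper or from LEI's actual output.

```python
# Toy rule-set classifier over discrete attribute values (illustrative only;
# the attributes and rules below are invented, not produced by LEI).

RULES = [
    # (conditions on attribute values, predicted class)
    ({"outlook": "sunny", "humidity": "high"}, "no"),
    ({"outlook": "overcast"}, "yes"),
    ({"outlook": "rain", "wind": "strong"}, "no"),
]

def classify(instance, rules, default="yes"):
    """Return the class of the first rule whose conditions all hold,
    falling back to a default class when no rule fires."""
    for conditions, predicted in rules:
        if all(instance.get(attr) == value for attr, value in conditions.items()):
            return predicted
    return default

print(classify({"outlook": "sunny", "humidity": "high", "wind": "weak"}, RULES))
# -> "no"
```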

Cited by 1 publication (3 citation statements)
References 5 publications
“…For each such search, the objects belonging to the class in question were treated as the positive objects and all other objects in the data set were treated as negative objects. This search was performed using each of the following search methods: OPUS_o; OPUS_o without optimistic pruning; OPUS_o without other pruning; OPUS_o without optimistic reordering; and fixed-order search, such as performed by Clearwater and Provost (1990), Rymon (1993), Schlimmer (1993), Segal and Etzioni (1994) and Webb (1990).…”
Section: Experimental Methods
confidence: 99%
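A minimal sketch of the one-class-versus-rest setup described in the quoted passage: for each class, its objects are treated as positive and every other object as negative before the rule search is run. The data layout and the run_rule_search placeholder are assumptions for illustration, not details of the cited experiments.

```python
# Hedged sketch of the per-class binarisation described above; the data
# layout and run_rule_search placeholder are illustrative assumptions.

def binarize_by_class(objects, labels, target_class):
    """Split objects into positives (the target class) and negatives
    (every other class), as done once per class before each search."""
    positives = [o for o, y in zip(objects, labels) if y == target_class]
    negatives = [o for o, y in zip(objects, labels) if y != target_class]
    return positives, negatives

def run_rule_search(positives, negatives):
    """Placeholder for a rule search such as OPUS_o or a fixed-order search."""
    raise NotImplementedError

# One search per class in the data set:
# for cls in sorted(set(labels)):
#     pos, neg = binarize_by_class(objects, labels, cls)
#     run_rule_search(pos, neg)
```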
“…In particular, when expanding a node n in a search tree, the OPUS algorithms seek to identify search operators that can be excluded from consideration in the search tree descending from n without excluding a sole goal node from that search tree. The OPUS algorithms differ from most previous admissible search algorithms employed in machine learning (Clearwater & Provost, 1990; Murphy & Pazzani, 1994; Rymon, 1992; Segal & Etzioni, 1994; Webb, 1990) in that when such operators are identified, they are removed from consideration in all branches of the search tree that descend from the current node. In contrast, the other algorithms only remove a single branch at a time without altering the operators considered below sibling branches, thereby pruning fewer nodes from the search space.…”
Section: Unordered Search Spaces
confidence: 99%
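The sketch below illustrates the pruning behaviour the passage attributes to OPUS: operators pruned at a node are withheld from every branch descending from that node, rather than a single branch being removed at a time. The evaluate and can_prune callbacks are assumptions for illustration; this is not the paper's implementation.

```python
# Illustrative sketch of OPUS-style pruning over an unordered search space
# of condition sets; evaluate/can_prune are assumed callbacks, not the
# authors' code.

def opus_style_search(conditions, evaluate, can_prune):
    """Search subsets of `conditions`, removing pruned operators from all
    branches that descend from the node at which they were pruned."""
    best_score, best_node = float("-inf"), frozenset()

    def expand(current, available):
        nonlocal best_score, best_node
        # Operators pruned here are excluded from *every* descendant branch
        # of `current`, not just from one child, which is the behaviour the
        # quoted passage contrasts with per-branch pruning.
        viable = [c for c in available if not can_prune(current, c, best_score)]
        for i, c in enumerate(viable):
            child = current | {c}
            score = evaluate(child)
            if score > best_score:
                best_score, best_node = score, child
            # A fixed enumeration order ensures each subset is visited once.
            expand(child, viable[i + 1:])

    expand(frozenset(), list(conditions))
    return best_score, best_node
```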