2002
DOI: 10.1147/sj.413.0428
A decision-tree-based symbolic rule induction system for text categorization

Cited by 88 publications (31 citation statements)
References 13 publications
“…The transformation of decision trees to rule-based classifiers is discussed generally in [106], and for the particular case of text data in [68]. For each path in the decision tree a rule can be generated, which represents the conjunction of the predicates along that path.…”
Section: Rule-based Classifiers
Mentioning confidence: 99%
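The path-to-rule transformation described in the statement above can be sketched in a few lines: walking every root-to-leaf path and conjoining the predicates tested along the way yields one rule per leaf. The toy tree and feature names below are illustrative, not taken from the cited paper.

```python
# Minimal sketch: each internal node is (feature, threshold, left, right);
# each leaf is a class label. The word-count features are hypothetical.
tree = (
    "count_ball", 1,
    ("count_goal", 1, "other", "sports"),
    "sports",
)

def extract_rules(node, path=()):
    """Walk every root-to-leaf path, emitting (conjunction, label) rules."""
    if isinstance(node, str):            # leaf: the accumulated path is a rule
        return [(list(path), node)]
    feature, threshold, left, right = node
    rules = []
    # Left branch: the predicate "feature < threshold" holds on this path.
    rules += extract_rules(left, path + ((feature, "<", threshold),))
    # Right branch: the predicate "feature >= threshold" holds instead.
    rules += extract_rules(right, path + ((feature, ">=", threshold),))
    return rules

for conj, label in extract_rules(tree):
    clause = " AND ".join(f"{f} {op} {t}" for f, op, t in conj)
    print(f"IF {clause} THEN class = {label}")
```

Each printed rule is exactly the conjunction of predicates along one path, which is what makes decision trees directly translatable into symbolic rule sets.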
“…1b) the point X best fits into class 1 (blue circles) according to a majority vote of the five nearest points. A decision tree (Apte et al. 1998; Johnson et al. 2002) is a method of visually and explicitly representing decisions and decision making. The goal is to create a model that predicts the value of a target variable based on several input variables.…”
Section: Text Categorization Based On Machine Learning Methods
Mentioning confidence: 99%
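The k-nearest-neighbour majority vote mentioned in the statement above can be sketched directly: a point is assigned to the class holding the majority among its k closest labelled points. The toy 2-D data is illustrative only.

```python
# Sketch of k-NN classification by majority vote over the k nearest points.
from collections import Counter
import math

def knn_predict(points, labels, x, k=5):
    """Classify x by majority vote among its k nearest training points."""
    # Rank training points by Euclidean distance to the query point x.
    order = sorted(range(len(points)),
                   key=lambda i: math.dist(points[i], x))
    # Count class labels among the k nearest and return the majority class.
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

points = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6)]
labels = ["blue", "blue", "blue", "red", "red"]
print(knn_predict(points, labels, (0.5, 0.5), k=3))  # prints "blue"
```

Unlike a decision tree, this classifier builds no explicit model; the vote is recomputed from the stored training points at prediction time.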
“…A decision-tree learning algorithm based on Gini impurity is used for classification because of its readily interpretable results (rule induction using decision-tree algorithms [17] has the characteristic of direct translatability into logical clauses). Gini impurity measures the degree of impurity in a given dataset comprising multiple class labels, i.e., it measures the probability that a randomly chosen element would be incorrectly labelled if labels were assigned at random according to the class distribution.…”
Section: A Decision Tree Learning
Mentioning confidence: 99%
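The Gini impurity described in the statement above has a short closed form: one minus the sum of squared class proportions. A minimal sketch, with illustrative label sets:

```python
# Sketch of Gini impurity: Gini = 1 - sum(p_c^2) over class proportions p_c.
from collections import Counter

def gini_impurity(labels):
    """Probability of mislabelling a random element under the class mix."""
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

print(gini_impurity(["a", "a", "a", "a"]))  # pure node -> 0.0
print(gini_impurity(["a", "a", "b", "b"]))  # even two-class split -> 0.5
```

A pure node scores 0, and the score grows toward 1 - 1/k as the k classes approach an even mix, which is why tree learners split on the attribute that most reduces this quantity.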