2017
DOI: 10.1007/978-3-319-60840-2_30

Resolving the Conflicts Between Cuts in a Decision Tree with Verifying Cuts

Cited by 3 publications (2 citation statements)
References 9 publications
“…Classifying algorithms are applied to solve problems in machine learning, pattern recognition, expert systems, knowledge discovery, and data mining [1]. There are several ways to construct classifiers, which rely on decision rules, decision trees, neural networks, inductive logic programming, and classical or modern statistical methods [1–5]. Data sets used in classifiers may be presented by data tables where objects correspond to rows and attributes correspond to columns of the given data table.…”
Section: Introduction (mentioning, confidence: 99%)
“…There are several ways to construct classifiers, which rely on decision rules, decision trees, neural networks, inductive logic programming, and classical or modern statistical methods [1–5]. Data sets used in classifiers may be presented by data tables where objects correspond to rows and attributes correspond to columns of the given data table. In this contribution we consider decision tables of the form T = (U, A, d) [6], where A is the set of attributes, U is the set of objects, and d is a decision attribute (distinguished from the set A).…”
(mentioning, confidence: 99%)
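
For readers unfamiliar with the notation used in the excerpt above, the following Python sketch shows one possible in-memory representation of a decision table T = (U, A, d): objects as rows, attributes as columns, and a distinguished decision attribute d. The class name, attribute names, and example values are illustrative assumptions and do not come from the cited works.

from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass
class DecisionTable:
    # Decision table T = (U, A, d): objects are rows, attributes are columns,
    # and d is a decision attribute kept apart from the conditional attributes in A.
    attributes: List[str]          # the set A of conditional attributes
    decision: str                  # the decision attribute d
    objects: List[Dict[str, Any]]  # the set U; each object maps attribute name -> value

    def column(self, attribute: str) -> List[Any]:
        # One column of the table: the values of a single attribute over all objects.
        return [obj[attribute] for obj in self.objects]

    def decision_value(self, obj: Dict[str, Any]) -> Any:
        # The value of the decision attribute d for one object.
        return obj[self.decision]

# Hypothetical example with two conditional attributes and a decision attribute.
table = DecisionTable(
    attributes=["temperature", "humidity"],
    decision="play",
    objects=[
        {"temperature": "hot",  "humidity": "high",   "play": "no"},
        {"temperature": "mild", "humidity": "normal", "play": "yes"},
    ],
)

print(table.column("temperature"))             # ['hot', 'mild']
print(table.decision_value(table.objects[0]))  # no

Keeping d separate from A mirrors the definition quoted above, where the decision attribute is explicitly distinguished from the conditional attribute set.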