2005
DOI: 10.1109/tevc.2004.837752

Handling Continuous Attributes in an Evolutionary Inductive Learner

Abstract: This paper experimentally analyzes discretization algorithms for handling continuous attributes in evolutionary learning. We consider a learning system that induces a set of rules in a fragment of first-order logic (evolutionary inductive logic programming), and introduce a method where a given discretization algorithm is used to generate initial inequalities, which describe subranges of attributes' values. Mutation operators exploiting information on the class label of the examples (supervised discretization)…
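The abstract's idea of seeding the learner with inequalities from a supervised discretization can be illustrated with a minimal sketch. The classic starting point is to place candidate cut points at midpoints between consecutive sorted attribute values whose class labels differ; the function names and the tuple representation of inequalities below are illustrative assumptions, not the paper's actual algorithm.

```python
def boundary_cut_points(values, labels):
    """Midpoints between consecutive sorted values whose class labels differ
    (class-aware, i.e. supervised, candidate cut points)."""
    pairs = sorted(zip(values, labels))
    cuts = []
    for (v1, c1), (v2, c2) in zip(pairs, pairs[1:]):
        if c1 != c2 and v1 != v2:
            cuts.append((v1 + v2) / 2.0)
    return cuts


def initial_inequalities(attr_name, values, labels):
    """Turn the cut points into pairs of inequalities, each pair describing
    one subrange of the attribute's values."""
    cuts = boundary_cut_points(values, labels)
    bounds = [min(values)] + cuts + [max(values)]
    return [(f"{lo} <= {attr_name}", f"{attr_name} <= {hi}")
            for lo, hi in zip(bounds, bounds[1:])]
```

For example, values [1, 2, 3, 10, 11] with labels [a, a, a, b, b] yield a single cut at 6.5 and two subranges, each expressed as a pair of inequalities over the attribute.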

Cited by 12 publications (10 citation statements)
References 38 publications
“…This method identifies the damage on the basis of the sound emission spectral characteristics and is sufficient for recognition of injuries, taking into account a small number of measurements. In the literature, many studies regarding other methods of failure identification may be found. It is possible to use heuristic search methods in order to create the method of decision trees (Divina, Marchiori, 2005). Actions undertaken while the problem is being solved can be classified as searching for objects of the specified characteristics.…”
Section: Methods of Decision Trees
confidence: 99%
“…With the inductive approach, the classification rules are obtained from examples, where each object or example belongs to a known labeled class. The goal of this inductive approach is to ultimately generate classification models that assign new objects to the right class [22]. Decision tree (DT), support vector machine (SVM) [23,24], naive Bayes (NB), Bayesian networks (BN), neural networks (NN), and k-nearest neighbor (k-nn) are well-known examples of the inductive approach [2,22,23].…”
Section: Proaftn Method
confidence: 99%
“…However, the main problem associated with MCDA is that the classification models do not automatically result only from the vectors describing the objects but depend also on the judgment of a DM. The DM defines the “boundaries” of the attributes and the weights which define the importance of each attribute in the data set [17,22]. However, it is usually difficult for a DM to assign accurate quantitative values to these parameters.…”
Section: Introduction
confidence: 99%
“…In this case the EA is effectively doing a discretization of continuous values "on-the-fly", since by creating rule conditions such as "30K ≤ Salary ≤ 50K" the EA is effectively producing discrete intervals. The effectiveness of an EA that directly copes with continuous attributes can be improved by using operators that enlarge or shrink the intervals based on concepts and methods borrowed from the research area of discretization in data mining (Divina & Marchiori 2005).…”
Section: Individual Representation for Classification-Rule Discovery
confidence: 99%
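The "enlarge or shrink" operators mentioned in the last citation statement can be sketched as a mutation that moves one bound of an interval condition (such as 30000 ≤ Salary ≤ 50000) to the nearest cut point in the chosen direction, with the cut points supplied by a discretization step. The function name and the one-cut-point step policy are illustrative assumptions, not the operators defined in the cited paper.

```python
import random


def mutate_interval(lo, hi, cut_points, enlarge, rng=random):
    """Mutate an interval condition (lo <= attr <= hi): pick one bound at
    random, then move it outward (enlarge=True) or inward (enlarge=False)
    to the nearest precomputed cut point in that direction.
    Returns the interval unchanged if no cut point is available."""
    move_low = rng.random() < 0.5
    if enlarge:
        if move_low:
            candidates = [c for c in cut_points if c < lo]
            return (max(candidates), hi) if candidates else (lo, hi)
        candidates = [c for c in cut_points if c > hi]
        return (lo, min(candidates)) if candidates else (lo, hi)
    # shrink: move a bound inward, staying strictly inside the interval
    candidates = [c for c in cut_points if lo < c < hi]
    if not candidates:
        return (lo, hi)
    if move_low:
        return (min(candidates), hi)
    return (lo, max(candidates))
```

Restricting mutated bounds to discretization cut points is what lets the operator exploit supervised information: the interval edges only ever land where the class distribution actually changes, rather than at arbitrary continuous values.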