1997
DOI: 10.1007/3-540-63223-9_138

Induction of strong feature subsets

Abstract: The problem of feature subset selection can be defined as the selection of a relevant subset of features which allows a learning algorithm to induce small, high-accuracy concepts. To achieve this goal, two main approaches have been developed: the first is algorithm-independent (the filter approach) and considers only the data, while the second takes into account both the data and a given learning algorithm (the wrapper approach). Recent work has studied the interest of rough s…


Cited by 5 publications (3 citation statements) · References 10 publications
“…: (1) when α increases, the size of reducts decreases and the number of reducts generally increases; (2) we obtain the classical reducts (corresponding to the classical framework of rough sets) when α = 0; (3) the best reduct with the highest accuracy is obtained when α > 0, i.e. using α-RST concepts [16]; (4) theoretically, the algorithm can find C(N, ⌊N/2⌋) reducts.…”
Section: α-Reducts Algorithm
confidence: 95%
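The bound in point (4), the binomial coefficient C(N, ⌊N/2⌋), follows because reducts are minimal attribute subsets: no reduct can contain another, so the reducts form an antichain in the subset lattice, and Sperner's theorem caps the size of any such family. A minimal sketch of that bound (the function name is illustrative, not from the paper):

```python
from math import comb

def max_reducts(n):
    """Upper bound on the number of reducts over n attributes.

    Reducts are minimal attribute subsets, so no reduct contains
    another; by Sperner's theorem the largest antichain in the
    lattice of subsets of an n-element set has size C(n, floor(n/2)).
    """
    return comb(n, n // 2)
```

For example, over 4 attributes at most C(4, 2) = 6 reducts can exist, however the decision table is chosen.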
“…In general, they can be classified into two categories: (1) the filter approach, which serves as a filter to sieve out irrelevant and/or redundant features without taking the induction algorithm into account [1][5][10]; and (2) the wrapper approach, which uses the induction algorithm itself as a black box during attribute selection to select a good feature subset that improves the performance, i.e. the accuracy, of the induction algorithm [4][8][11][16]. Although the wrapper approach has significantly improved the accuracy of well-known algorithms, like C4.5 and Naive Bayes, its generalization is limited for many reasons, i.e.…”
Section: Introduction
confidence: 99%
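The filter/wrapper distinction quoted above can be sketched in a few lines. This is a hedged illustration, not the paper's algorithm: the relevance score and subset evaluator are hypothetical stand-ins supplied by the caller.

```python
def filter_select(features, score_feature, k):
    """Filter approach: rank features by a data-only relevance score
    (e.g. correlation or mutual information with the class), ignoring
    the induction algorithm entirely, and keep the top k."""
    ranked = sorted(features, key=score_feature, reverse=True)
    return set(ranked[:k])

def wrapper_select(features, evaluate_subset):
    """Wrapper approach: greedy forward selection, scoring each
    candidate subset with the induction algorithm itself, treated
    as a black box (e.g. cross-validated accuracy)."""
    selected = set()
    best = evaluate_subset(selected)
    improved = True
    while improved:
        improved = False
        for f in set(features) - selected:
            acc = evaluate_subset(selected | {f})
            if acc > best:
                best, pick, improved = acc, f, True
        if improved:
            selected.add(pick)
    return selected
```

The filter runs once over the data, while the wrapper re-invokes the learner for every candidate subset, which is what buys its accuracy gains at the cost of generality and runtime.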