2012
DOI: 10.1016/j.ijar.2012.02.004

An efficient rough feature selection algorithm with a multi-granulation view

Abstract: Feature selection is a challenging problem in many areas such as pattern recognition, machine learning and data mining. Rough set theory, as a valid soft computing tool for analyzing various types of data, has been widely applied to select helpful features (also called attribute reduction). Many feature selection algorithms have been developed in the rough set literature; however, they are very time-consuming when data sets are large-scale. To overcome this limitation, we propose in this paper an…

Cited by 162 publications (44 citation statements) | References 38 publications
“…In Theorem 3, formulas (16) and (17) show the relationship between the multiple multigranulation rough set and the optimistic multigranulation rough set, while formulas (18) and (19) show the relationship between the multiple multigranulation rough set and the pessimistic multigranulation rough set.…”
Section: Theorem 3 Let I Be An Information System In Which At… (mentioning)
confidence: 99%
“…Secondly, the asynchronous multigranulation approach: each granulation is constructed or obtained from the previous granulation. For example, Qian et al. 33,34 proposed a positive approximation accelerator for attribute reduction, which makes the universe smaller step by step; Liang et al. 19 proposed an efficient rough feature selection algorithm for large-scale data sets, which selects a valid feature subset by dividing the large sample set into small sample sets and fusing the feature selection results of the small sets together, and they 20 also studied an incremental feature selection mechanism that accounts for the monotonic increase of samples; Wang et al. presented a dimension-incremental strategy for attribute reduction, in which the current information entropy can be updated from the last computation result.…”
Section: Introduction (mentioning)
confidence: 99%
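The divide-and-fuse scheme attributed to Liang et al. above can be sketched as follows. This is a simplified illustration under assumptions, not the authors' actual algorithm: the per-block selector used here (keeping only features that vary within a block) is a hypothetical stand-in for a real rough-set reduct computation based on positive regions or information entropy.

```python
def select_features(block, n_features):
    # Hypothetical per-block selector: keep only features whose values
    # vary within the block. A real rough-set reduct algorithm would use
    # positive regions or conditional information entropy instead.
    return {f for f in range(n_features)
            if len({row[f] for row in block}) > 1}

def multigranulation_select(samples, n_features, n_blocks=4):
    # Divide the large sample set into smaller granularities (blocks),
    # run feature selection on each block, then fuse the per-block
    # results by union, as in the divide-and-fuse scheme described above.
    size = max(1, len(samples) // n_blocks)
    blocks = [samples[i:i + size] for i in range(0, len(samples), size)]
    fused = set()
    for block in blocks:
        fused |= select_features(block, n_features)
    return sorted(fused)
```

Because each block is much smaller than the full sample set, the per-block selection is cheap, and the union fusion keeps any feature found useful on at least one granularity.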
“…Zhao et al [17] set up a fuzzy variable precision rough set model by combining the fuzzy rough set and the variable precision rough set, with the goal of establishing the fuzzy rough set as a special case. Liang et al [18] proposed an efficient attribute reduction algorithm for large-scale decision tables in which a sub-table of the large-scale data set could be considered as a single, small granularity. Wang et al [19] presented some basic structural properties of attribute reduction by covering rough sets, and developed a heuristic algorithm to find the attribute subset that approximates to be a minimal reduction.…”
Section: Literature Review (mentioning)
confidence: 99%
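Heuristic attribute-reduction algorithms of the kind quoted above are commonly realized as a greedy forward search over the rough-set dependency degree (in the style of QuickReduct). The sketch below is a generic illustration of that technique, not the specific algorithm of Wang et al.: at each step it adds the attribute that most increases the fraction of objects in the positive region, stopping once the dependency of the full attribute set is matched.

```python
from collections import defaultdict

def partition(rows, attrs):
    # Equivalence classes of the indiscernibility relation over `attrs`.
    classes = defaultdict(list)
    for i, row in enumerate(rows):
        classes[tuple(row[a] for a in attrs)].append(i)
    return list(classes.values())

def dependency(rows, labels, attrs):
    # gamma(attrs): fraction of objects in the positive region, i.e.
    # objects whose equivalence class is consistent on the decision label.
    if not attrs:
        return 0.0
    pos = sum(len(c) for c in partition(rows, attrs)
              if len({labels[i] for i in c}) == 1)
    return pos / len(rows)

def quickreduct(rows, labels):
    # Greedy forward selection: repeatedly add the attribute that most
    # increases the dependency until it matches that of all attributes.
    all_attrs = list(range(len(rows[0])))
    target = dependency(rows, labels, all_attrs)
    reduct, gamma = [], 0.0
    while gamma < target:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(rows, labels, reduct + [a]))
        reduct.append(best)
        gamma = dependency(rows, labels, reduct)
    return reduct
```

The result approximates a minimal reduction: it preserves the classification power of the full attribute set, though a greedy search does not guarantee a reduct of minimum size.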
“…Ten UCI machine learning datasets, whose sizes have been magnified to 10^5 times those of the original datasets, are selected to verify the performance of DCCAEDR against such representative parallel attribute reduction algorithms as PACCA [30] and E-FSA [51]. PACCA is a parallel algorithm for computing the core attributes; it can compute equivalence classes and reduce the search space.…”
Section: B. Attribute Reduction Evaluation on UCI Datasets (mentioning)
confidence: 99%
“…Qian et al. (2014b) developed a new multigranulation rough set model based on the "seeking common ground while eliminating differences" (SCED) strategy, called pessimistic multigranulation rough-set-based decision. Liang et al. (2012) proposed an efficient rough feature selection algorithm for large-scale data sets, which was inspired by multi-granulation rough sets.…”
mentioning
confidence: 99%