2008
DOI: 10.1142/s0218488508005121

Combination Entropy and Combination Granulation in Rough Set Theory

Abstract: Based on the intuitionistic knowledge content nature of information gain, the concepts of combination entropy and combination granulation are introduced in rough set theory. The conditional combination entropy and the mutual information are defined, and several useful properties of theirs are derived. Furthermore, the relationship between the combination entropy and the combination granulation is established, which can be expressed as CE(R) + CG(R) = 1. All properties of the above concepts are special instances…
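The identity CE(R) + CG(R) = 1 stated in the abstract can be checked numerically from the measures' combinatorial definitions. The sketch below assumes the standard formulation of these measures (each block X of the partition U/R contributes its pair count C(|X|, 2) relative to the total pair count C(|U|, 2)); the function and variable names are illustrative, not taken from the paper.

```python
from math import comb


def combination_entropy(partition, n):
    """CE(R) = sum over blocks X of (|X|/n) * (C(n,2) - C(|X|,2)) / C(n,2)."""
    total_pairs = comb(n, 2)
    return sum((len(x) / n) * (total_pairs - comb(len(x), 2)) / total_pairs
               for x in partition)


def combination_granulation(partition, n):
    """CG(R) = sum over blocks X of (|X|/n) * C(|X|,2) / C(n,2)."""
    total_pairs = comb(n, 2)
    return sum((len(x) / n) * comb(len(x), 2) / total_pairs
               for x in partition)


# A partition of U = {0, ..., 5} induced by some equivalence relation R.
U = list(range(6))
partition = [[0, 1, 2], [3, 4], [5]]

ce = combination_entropy(partition, len(U))
cg = combination_granulation(partition, len(U))
print(ce, cg, ce + cg)  # the two measures always sum to 1
```

A finer partition (more, smaller blocks) drives CG toward 0 and CE toward 1, matching the intuition that entropy measures discernibility while granulation measures coarseness.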


Cited by 153 publications (94 citation statements)
References 11 publications
“…To overcome this limitation, we modify formula (8) based on the idea of combination entropy and combination granulation [42,45], and the definition of the combined fuzziness-based uncertainty measure is as follows.…”
Section: Modified Uncertainty Measure of Rough Fuzzy Sets
confidence: 99%
“…Therefore, feature selection techniques have been extensively studied in the past decades. Some well-known techniques, such as attribute frequency of the discernibility matrix (Guan et al. 2007; Wang and Wang 2001; Zhang et al. 2003a, 2003b, 2003c), positive region, i.e., dependence (Chouchoulas and Shen 2001; Modrzejewski 1993), mutual information (Peng et al. 2005; Slezak 2002), the consistency method (Dash and Liu 2003), and various information-entropy reductions (Slezak 2002; Liang and Xu 2002; Qian and Liang 2008), have been developed based on this optimal algorithm. Zhang et al. (2003a, 2003b, 2003c) proposed an algorithm based on the discernibility matrix, which employed the appearing frequency of an attribute as heuristic information to select attributes and eliminate redundant attributes, with time complexity O(|C|^2|U|^2).…”
Section: Related Work
confidence: 99%
“…proposed a new conditional information entropy that can reflect the change of the condition attributes distribution in the process of reduction with respect to the positive region and presented a new representation of attribute reduction. Qian and Liang (2008) presented the definitions of the conditional combination entropy and the mutual information, researched several useful properties, and built a heuristic function in the heuristic reduction algorithm with respect to combination entropy.…”
Section: Related Work
confidence: 99%
“…Therefore, some heuristic algorithms that can find one reduct in a shorter time were proposed in Refs. [11,13,24,25,39,43,45,46,48], most of which are greedy and forward search algorithms. Starting with a nonempty set, these search algorithms keep adding one or several attributes of high significance into a pool at each iteration until the dependence no longer increases.…”
Section: Reduction Algorithm to Small Granularity
confidence: 99%
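The greedy, forward search these excerpts describe can be sketched generically: keep adding the most significant remaining attribute until the stopping criterion holds. This is an illustrative skeleton only, not any specific cited algorithm; the significance function and reduct test are placeholders that an entropy- or dependence-based method would supply, and the toy data below is invented.

```python
def greedy_forward_reduction(attributes, significance, is_reduct):
    """Greedy forward attribute selection: at each iteration, add the attribute
    of highest significance relative to the current pool, stopping once the
    pool satisfies the reduct criterion (e.g. the dependence or entropy
    measure no longer improves)."""
    pool = []
    remaining = list(attributes)
    while remaining and not is_reduct(pool):
        best = max(remaining, key=lambda a: significance(a, pool))
        pool.append(best)
        remaining.remove(best)
    return pool


# Toy stand-ins: "significance" = number of objects an attribute newly
# distinguishes; the pool is a "reduct" once every object is covered.
universe = {1, 2, 3, 4}
covers = {"a": {1, 2}, "b": {2, 3}, "c": {4}}


def sig(a, pool):
    covered = set().union(*(covers[p] for p in pool)) if pool else set()
    return len(covers[a] - covered)


def done(pool):
    covered = set().union(*(covers[p] for p in pool)) if pool else set()
    return covered == universe


print(greedy_forward_reduction(covers, sig, done))  # → ['a', 'b', 'c']
```

Because each iteration commits to the locally best attribute, such algorithms find one reduct quickly but give no guarantee of a minimal one, which is exactly the trade-off the excerpts above note against exhaustive discernibility-matrix approaches.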
“…These two attribute reduction algorithms are usually computationally very expensive, especially for dealing with large-scale data sets of high dimensions. Therefore, to overcome this difficulty, many heuristic attribute reduction algorithms have been developed in rough set theory [11,13,24,25,39,35,43,45,46,48]. A heuristic attribute reduction algorithm can extract a single reduct from a given table in a relatively short time.…”
Section: Introduction
confidence: 99%