2015
DOI: 10.1016/j.ijar.2015.01.005

Monotonic uncertainty measures for attribute reduction in probabilistic rough set model

Abstract: Attribute reduction is one of the most fundamental and important topics in rough set theory. Uncertainty measures play an important role in attribute reduction. In the classical rough set model, uncertainty measures have the monotonicity with respect to the granularity of partition. However, the monotonicity of uncertainty measures does not hold when uncertainty measures in classical rough set model are directly extended into probabilistic rough set model, which makes it not so reasonable to use them to evalua…
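To make the setting concrete, here is a minimal sketch (not the authors' method; the toy decision table, the thresholds alpha = 0.7 and beta = 0.3, and all function names are assumptions for illustration) of the basic probabilistic rough set construction on which such uncertainty measures are defined: equivalence classes from a decision table, followed by probabilistic lower/upper approximations of a target concept.

```python
from collections import defaultdict

def partition(universe, attrs, table):
    """Group objects by their values on the chosen attributes (equivalence classes)."""
    blocks = defaultdict(list)
    for x in universe:
        key = tuple(table[x][a] for a in attrs)
        blocks[key].append(x)
    return list(blocks.values())

def probabilistic_approximations(blocks, target, alpha=0.7, beta=0.3):
    """Probabilistic rough set approximations: a block enters the lower
    approximation if P(target | block) >= alpha, and the upper one if it is > beta."""
    target = set(target)
    lower, upper = [], []
    for block in blocks:
        p = len(target & set(block)) / len(block)
        if p >= alpha:
            lower.extend(block)
        if p > beta:
            upper.extend(block)
    return lower, upper

# Toy decision table: 6 objects, condition attributes a1, a2, decision d.
table = {
    1: {"a1": 0, "a2": 0, "d": 1},
    2: {"a1": 0, "a2": 0, "d": 1},
    3: {"a1": 0, "a2": 1, "d": 0},
    4: {"a1": 1, "a2": 1, "d": 1},
    5: {"a1": 1, "a2": 1, "d": 0},
    6: {"a1": 1, "a2": 0, "d": 0},
}
universe = list(table)
target = [x for x in universe if table[x]["d"] == 1]

blocks = partition(universe, ["a1", "a2"], table)
lower, upper = probabilistic_approximations(blocks, target)
print("lower:", sorted(lower), "upper:", sorted(upper))  # lower: [1, 2]  upper: [1, 2, 4, 5]
```

In the classical model (alpha = 1, beta = 0), measures built on these approximations change monotonically as the partition is refined; with general thresholds that monotonicity can fail, which is the issue the abstract refers to.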

Cited by 96 publications (16 citation statements)
References 45 publications
“…In the following, we compare the three algorithms, AQRSS, CERSS and MCRSS, from the viewpoint of core attributes. For readers' convenience, we only display the core attributes for one fixed radius; given δ = 0.15, we use boundary samples to compute the core attributes [55,56], and the reasoning behind the process is similar to the algorithm proposed by Wang et al. [56]. We remove only one attribute at a time from the raw attribute set (AT), so that the subset made up of the remaining attributes cannot satisfy the constraints in the definition.…”
Section: Comparisons Of Core Attributes
confidence: 99%
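The core-attribute check quoted above (drop a single attribute and test whether the remaining subset still satisfies the reduction constraint) can be sketched as follows; the predicate keeps_constraint is a hypothetical placeholder for whatever positive-region or uncertainty-measure condition the cited algorithms actually use with the fixed radius δ = 0.15.

```python
def core_attributes(attributes, keeps_constraint):
    """Return the attributes whose individual removal breaks the reduction
    constraint; such attributes belong to the core.
    `keeps_constraint(attrs)` is a hypothetical user-supplied predicate."""
    core = []
    for a in attributes:
        remaining = [b for b in attributes if b != a]
        if not keeps_constraint(remaining):  # constraint fails without a
            core.append(a)
    return core

# Toy usage: pretend the constraint is simply "at least 3 attributes remain".
print(core_attributes(["a1", "a2", "a3"], lambda attrs: len(attrs) >= 3))
# -> ['a1', 'a2', 'a3']
```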
“…Granular computing is a structural methodology for hierarchical computing and information processing [43,44], and its multi-granularity, multi-level techniques are useful for uncertainty analysis and knowledge acquisition from data. In rough set theory, information granulation is of extensive concern [45,46,47,48,49], and granulation monotonicity plays an important role in attribute reduction [12,50,51,52]. In particular, a decision table acts as a formal background for data mining [12,53,54,55], and it involves condition/decision granules and classifications derived from granular structures.…”
Section: Introduction
confidence: 99%
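As a small illustration of the granulation monotonicity mentioned in this quote (again a sketch on made-up toy partitions, not taken from the cited works): adding condition attributes can only refine the induced partition, which is what makes partition-based measures monotone in the classical model. The helper below checks whether one partition refines another.

```python
def refines(finer, coarser):
    """True if every block of `finer` is contained in some block of `coarser`,
    i.e. the first partition is at least as fine (smaller granules)."""
    coarse_sets = [set(b) for b in coarser]
    return all(any(set(block) <= c for c in coarse_sets) for block in finer)

# Partition by {a1} versus by {a1, a2} on six toy objects.
by_a1 = [[1, 2, 3], [4, 5, 6]]
by_a1_a2 = [[1, 2], [3], [4, 5], [6]]
print(refines(by_a1_a2, by_a1))  # True: more attributes -> finer granulation
```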
“…Improving the Pawlak rough set model by incorporating quantitative information is a promising direction. The improved models are regarded as quantitative rough set models, and they include probabilistic rough sets (PRS) [4–11], graded rough sets (GRS) [12–14], and double‐quantitative decision‐theoretic rough set (Dq‐DTRS) [14–20] models.…”
Section: Introduction
confidence: 99%
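For contrast with the probabilistic thresholds sketched earlier, one common formulation of the graded rough set (GRS) approximations mentioned in this quote replaces relative frequencies with absolute counts; the grade k and the toy blocks below are assumptions for illustration, not the cited authors' construction.

```python
def graded_approximations(blocks, target, k=1):
    """Grade-k approximations (one standard formulation): a block joins the
    upper approximation if it contains more than k target objects, and the
    lower approximation if it contains at most k non-target objects.
    Unlike the classical case, the lower set need not be inside the upper set."""
    target = set(target)
    lower, upper = [], []
    for block in blocks:
        inside = len(target & set(block))
        if len(block) - inside <= k:
            lower.extend(block)
        if inside > k:
            upper.extend(block)
    return lower, upper

blocks = [[1, 2], [3], [4, 5], [6]]
target = [1, 2, 4]
print(graded_approximations(blocks, target, k=1))
# -> ([1, 2, 3, 4, 5, 6], [1, 2])
```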