2021
DOI: 10.1016/j.ins.2021.08.067
Granular-conditional-entropy-based attribute reduction for partially labeled data with proxy labels

Cited by 30 publications (3 citation statements)
References: 73 publications
“…For other types of data, redundant and noisy attributes must be removed. Moreover, as Figure 7 shows, compared with other attribute reduction algorithms (J CG [41], J CH [42], J MG , J MH [43], J JG , J JH , z, x + y + z [44], x + z [19], z-y [45]), the MCIR algorithm (red dotted line) designed in this paper achieves better accuracy with fewer attributes. Because the MCIR model removes as many redundant or noisy attributes as possible and optimizes the data through dimensionality reduction, the reduced data are better suited to anomaly detection tasks; the model therefore maintains or even improves detection accuracy while reducing time complexity.…”
Section: Attribute Reduction Under MCIR Algorithm
confidence: 94%
“…Conditional entropy is another important measure associated with the neighborhood rough set, which characterizes the discriminating performance of ∀A ⊆ AT with respect to d. Thus far, various forms of conditional entropy [50][51][52][53] have been proposed with respect to different requirements. One widely used form is shown below.…”
Section: Conditional Entropy
confidence: 99%
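The "widely used form" mentioned in the statement above is not reproduced in this report. As an illustration only, one common neighborhood conditional entropy can be sketched as follows; this is a minimal Python sketch assuming Euclidean distance and a fixed neighborhood radius δ, and the exact variant differs across the cited papers [50–53]:

```python
import numpy as np

def neighborhood_conditional_entropy(X, y, attrs, delta=0.2):
    """One common form of neighborhood conditional entropy CE(d | A):
        CE = -(1/n) * sum_i log( |n_A(x_i) ∩ [x_i]_d| / |n_A(x_i)| ),
    where n_A(x_i) is the delta-neighborhood of sample x_i under the
    attribute subset A, and [x_i]_d is x_i's decision class.
    Illustrative sketch only -- not the specific formula from the paper."""
    Xa = X[:, list(attrs)]
    n = X.shape[0]
    total = 0.0
    for i in range(n):
        dist = np.linalg.norm(Xa - Xa[i], axis=1)  # Euclidean distance (assumption)
        nbr = dist <= delta                        # delta-neighborhood (includes x_i)
        same = nbr & (y == y[i])                   # neighbors sharing x_i's label
        total += np.log(same.sum() / nbr.sum())
    return -total / n
```

The entropy is 0 when every delta-neighborhood is pure (fully consistent with the decision classes) and grows as neighborhoods mix labels, which is why it can serve as a discriminability measure for an attribute subset.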
“…There are two mainstream learning perspectives, i.e., supervised learning and unsupervised learning. We then pick the approximation quality [34] and conditional entropy [19,[35][36][37][38][39] as two custom measures to better comprehend and investigate the essence of attribute reduction in terms of the neighborhood rough set.…”
Section: Attribute Reduction
confidence: 99%
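As a rough illustration of how a measure such as approximation quality can drive attribute reduction, the following sketch performs a greedy forward search: it repeatedly adds the attribute that most increases the neighborhood approximation quality and stops when no attribute helps. This is a hypothetical implementation, not the cited papers' exact procedure; the Euclidean distance, radius `delta`, and stopping rule are all assumptions:

```python
import numpy as np

def approximation_quality(X, y, attrs, delta=0.2):
    """Neighborhood approximation quality gamma_A(d) = |POS_A(d)| / n:
    the fraction of samples whose delta-neighborhood under attribute
    subset A is pure (contained in one decision class). Sketch only."""
    Xa = X[:, list(attrs)]
    pos = 0
    for i in range(X.shape[0]):
        nbr = np.linalg.norm(Xa - Xa[i], axis=1) <= delta  # delta-neighborhood
        if np.all(y[nbr] == y[i]):  # neighborhood consistent with x_i's class
            pos += 1
    return pos / X.shape[0]

def greedy_reduct(X, y, delta=0.2):
    """Forward greedy attribute reduction: add the attribute with the
    largest gain in approximation quality until no attribute improves it."""
    remaining = set(range(X.shape[1]))
    reduct, best = [], 0.0
    while remaining:
        gains = {a: approximation_quality(X, y, reduct + [a], delta)
                 for a in remaining}
        a_star = max(gains, key=gains.get)
        if gains[a_star] <= best:  # no attribute improves the measure: stop
            break
        reduct.append(a_star)
        best = gains[a_star]
        remaining.remove(a_star)
    return reduct, best
```

Conditional entropy can be substituted as the significance measure by minimizing it instead of maximizing the quality; the greedy skeleton stays the same.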