2007
DOI: 10.1016/j.ins.2006.11.013

A novel approach to fuzzy rough sets based on a fuzzy covering☆

Cited by 199 publications (78 citation statements)
References 42 publications
“…Yeung et al [190] proposed some fuzzy-rough set models by means of arbitrary fuzzy relations and investigated the connections between the existing fuzzy-rough sets. Deng et al [191] investigated fuzzy relations by involving a fuzzy covering. Li and Ma [192] proposed two pairs of fuzzy-rough approximation operators, including fuzzy and crisp covering-based fuzzy-rough approximation operators.…”
Section: Related Workmentioning
confidence: 99%
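The relation-based and covering-based operators mentioned in this excerpt all share the same implicator/t-norm construction. As an illustration only (the exact operator pairs studied in [190-192] differ in their details), a minimal numerical sketch of the classical lower and upper fuzzy-rough approximations on a finite universe, assuming the Kleene-Dienes implicator and the minimum t-norm, might look like this:

```python
import numpy as np

# Minimal sketch (not the exact models of [190-192]): implicator/t-norm based
# fuzzy-rough approximations of a fuzzy set A under a fuzzy relation R on a
# finite universe, with the Kleene-Dienes implicator I(a,b) = max(1-a, b) and
# the minimum t-norm T(a,b) = min(a, b).

def lower_approx(R, A):
    # (R down A)(x) = inf_y I(R(x, y), A(y))
    return np.min(np.maximum(1.0 - R, A[None, :]), axis=1)

def upper_approx(R, A):
    # (R up A)(x) = sup_y T(R(x, y), A(y))
    return np.max(np.minimum(R, A[None, :]), axis=1)

# Toy example: a fuzzy similarity relation R and a fuzzy set A on three objects.
R = np.array([[1.0, 0.7, 0.2],
              [0.7, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
A = np.array([0.9, 0.6, 0.1])

print(lower_approx(R, A))  # degree to which each object certainly belongs to A
print(upper_approx(R, A))  # degree to which each object possibly belongs to A
```

Covering-based variants replace the relation R by (or derive R from) the blocks of a fuzzy covering, which is where the models cited above diverge.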
“…In the literature, different definitions of fuzzy coverings are proposed in [6,12]. However, we will use the following one, where the condition ∪C = U is maintained for infinite coverings.…”
Section: Definitionmentioning
confidence: 99%
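For concreteness, the condition referred to in this excerpt can be checked mechanically on a finite universe. The sketch below assumes the sup-based union (∪C)(x) = sup_{C∈C} C(x) and the strict requirement that it equals 1 everywhere; other variants in the cited literature relax this requirement.

```python
import numpy as np

# Minimal sketch of the covering condition referred to above: a finite family
# C = {C_1, ..., C_n} of fuzzy sets on a universe U is a fuzzy covering when
# (union C)(x) = sup_i C_i(x) = 1 for every x in U.  Variant definitions,
# e.g. requiring only sup_i C_i(x) > 0, also appear in the literature.

def is_fuzzy_covering(C):
    # C is an (n_sets, n_objects) array of membership degrees in [0, 1].
    return bool(np.all(np.max(C, axis=0) == 1.0))

# Three fuzzy sets over a universe of four objects; every column attains 1.
C = np.array([[1.0, 0.3, 0.0, 0.6],
              [0.2, 1.0, 0.5, 0.0],
              [0.0, 0.4, 1.0, 1.0]])

print(is_fuzzy_covering(C))  # True
```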
“…Various extensions of rough sets, such as variable precision rough sets [54], rough fuzzy sets [3], fuzzy rough sets [5,25,52], etc., have been developed as techniques of data analysis and processing in machine learning [26], pattern recognition [11,41], and artificial intelligence [19]. Since redundant information usually spans a number of attributes or features in real-world applications, which may confuse learning algorithms, slow down the learning process, and increase the risk that learned classifiers over-fit the training data [50], removing superfluous or irrelevant features is necessary in classification modeling.…”
Section: Introductionmentioning
confidence: 99%