2023
DOI: 10.1109/tfuzz.2023.3270445
DG-ALETSK: A High-Dimensional Fuzzy Approach With Simultaneous Feature Selection and Rule Extraction

Cited by 10 publications (5 citation statements)
References 39 publications
“…The optimal non-concentric ring images in the geological borehole image recognition database, relative to the original images, reduce the proportion of irrelevant and interfering feature information [37] while preserving effective image feature information. It is beneficial for the classification process.…”
Section: Circular Hough Transform
confidence: 99%
“…For high-dimensional datasets the number of features is large; for the last four datasets in particular, each has more than 20 features. Since fuzzy rules are better at fitting data with low feature dimensionality [14], the model's regression performance is challenged. In addition, some of the high-dimensional datasets have small sample sizes, such as FOR and BAS, which increases the risk of overfitting.…”
Section: Datasets
confidence: 99%
“…To alleviate "rule explosion", some examples from the literature [14,15] transfer input variables to a new feature space in advance by feature dimension reduction, such as the principal component analysis (PCA), and then carry out the structure and parameter identifications in this new space. For example, [15] restricts the maximum feature dimension to five and utilizes the PCA if the feature number exceeds five.…”
Section: Introduction
confidence: 99%
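The preprocessing described in the statement above (project inputs onto a lower-dimensional space with PCA before structure and parameter identification, with [15] capping the feature dimension at five) can be sketched as follows. This is a minimal illustration, not the cited method's implementation: the dataset is synthetic, and only the five-feature cap comes from the text.

```python
# Sketch of the PCA preprocessing described for [15]: if the feature
# dimension exceeds five, project the inputs onto the top five
# principal components before identifying rule structure/parameters.
# Synthetic data; only the five-feature cap is taken from the statement.
import numpy as np

MAX_FEATURES = 5  # maximum feature dimension reported for [15]

def pca_reduce(X, max_dim=MAX_FEATURES):
    """Project X onto its leading principal components if it is too wide."""
    if X.shape[1] <= max_dim:
        return X  # already low-dimensional; no reduction needed
    Xc = X - X.mean(axis=0)                 # center each feature
    # SVD of the centered data: rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:max_dim].T              # scores on the top components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))              # >20 features, as in the datasets cited
Z = pca_reduce(X)
print(Z.shape)                              # reduced to (100, 5)
```

The rule base is then identified in the reduced space, which keeps the number of fuzzy rules from growing with the original feature count.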
See 1 more Smart Citation
“…In recent years, knowledge distillation has emerged as a powerful technique for improving the performance of student networks in a wide range of tasks and domains, such as object detection [15,16], semantic segmentation [17][18][19], face recognition [20,21] and action recognition [22,23]. Its ability to transfer knowledge from a well-trained teacher network to a smaller student network has made it a popular choice for addressing challenges such as model compression, improving generalization and adapting models to resource-constrained environments.…”
Section: Introductionmentioning
confidence: 99%
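The teacher-to-student knowledge transfer the statement refers to is commonly realized as a temperature-softened distribution-matching loss. A minimal numpy sketch under that assumption (the logits and temperature are illustrative, not from the cited works):

```python
# Minimal sketch of a knowledge-distillation loss: the student is trained
# to match the teacher's temperature-softened output distribution.
# Logits and temperature here are illustrative values only.
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)    # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, T)           # soft targets from the teacher
    q = softmax(student_logits, T)           # student's soft predictions
    # T**2 rescales the gradient magnitude toward the hard-label loss scale
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T**2)

teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.5, 0.7, -0.8]])
print(distillation_loss(student, teacher))   # non-negative; 0 iff outputs match
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels, which is what lets a compact student approach the larger teacher's accuracy in the compression settings the statement lists.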