2023
DOI: 10.1109/tfuzz.2022.3214241

Multiobjective Evolutionary Optimization for Prototype-Based Fuzzy Classifiers

Abstract: Evolving intelligent systems (EISs), particularly zero-order ones, have demonstrated strong performance on many real-world data stream classification problems, while offering high model transparency and interpretability thanks to their prototype-based nature. Zero-order EISs typically learn prototypes by clustering streaming data online in a "one pass" manner for greater computational efficiency. However, such identified prototypes often lack optimality, resulting in less precise classification bo…

Cited by 3 publications (5 citation statements)
References 70 publications
“…3) SOTA Methods for Comparison: In this study, the following 17 single-model SOTA algorithms are used for performance comparison: 1) SVM [67]; 2) KNN [68]; 3) sequential classifier (SEQ) [69]; 4) sequence-dictionary-based KNN classifier (SDKNN) [69]; 5) extreme learning machine (ELM) [70]; 6) MLP; 7) LSTM [71]; 8) probabilistic neural network (PNN) [72]; 9) eigenvalue classifier (EIG) [73]; 10) spherical approximation classifier (SPA) [74]; 11) self-adaptive fuzzy learning system (SALF) [26]; 12) multi-objective optimised self-organising fuzzy inference system (MOOSOFIS) [75]; 13) SEFIS [38]; 14) ESAFIS [31]; 15) PALM [44]; 16) eClass0 classifier [24]; and 17) eClass1 classifier [24].…”
Section: Experimental Investigation (In This Section Numerical Example...)
confidence: 99%
“…In running the experiments, for OD, PR and IS datasets, their original training-testing splits are used. For the other seven datasets, 50% of data samples are randomly selected to construct the training sets and the remaining 50% are used as the validation sets [75]. The detailed classification results in terms of accuracy (Acc) and balanced accuracy (BAcc) are given in Supplementary Tables S9 and S10, respectively.…”
Section: B. Performance Demonstration On Numerical Classification Prob...
confidence: 99%
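The 50/50 random split quoted above can be sketched as follows; the helper name, seed, and toy arrays are illustrative and not taken from the cited paper:

```python
import numpy as np

def random_half_split(X, y, seed=0):
    """Randomly assign 50% of samples to training and the rest to validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))          # shuffle sample indices
    half = len(X) // 2
    train, valid = idx[:half], idx[half:]  # disjoint index sets
    return X[train], y[train], X[valid], y[valid]

# Toy data: 10 samples, 3 features
X = np.arange(30).reshape(10, 3)
y = np.array([0, 1] * 5)
X_tr, y_tr, X_va, y_va = random_half_split(X, y)
```

Because the split is driven by a permutation of indices, every sample lands in exactly one of the two halves, matching the quoted protocol.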
“…Comprising two identical neural networks with shared parameters, siamese networks generate a similarity score through a distance metric function. They have proven successful in tasks such as image matching and face recognition (Gu et al, 2022b ; Li M. et al, 2022 ). Distance metric learning is a key aspect of metric learning, aiming to learn a function that can measure the distance between samples.…”
Section: Related Work
confidence: 99%
“…Distance metric learning is a key aspect of metric learning, aiming to learn a function that can measure the distance between samples. Various methods exist for distance metric learning, including prototype-based methods (Gu et al, 2022b ), metric matrix-based methods (Price et al, 2022 ), and maximum margin-based methods (Li X. et al, 2022 ). Among these, Max-Margin Metric Learning (MMML) has emerged as a classic technique maximizing the distances between different classes while minimizing the distances within the same class.…”
Section: Related Work
confidence: 99%
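The max-margin idea described above can be illustrated with a minimal contrastive-style loss on pairwise distances; the function name, margin value, and distances below are illustrative, not the formulation from the cited works:

```python
def contrastive_loss(d, same_class, margin=1.0):
    """Margin-based metric-learning loss for one pair of embeddings:
    pull same-class pairs together; push different-class pairs
    at least `margin` apart (zero loss once they are far enough)."""
    if same_class:
        return 0.5 * d ** 2
    return 0.5 * max(0.0, margin - d) ** 2

# Distances between two embedding pairs (toy values)
d_same, d_diff = 0.2, 0.3
loss_same = contrastive_loss(d_same, True)   # small pull term
loss_diff = contrastive_loss(d_diff, False)  # penalized: pair sits inside the margin
```

Minimizing this loss over many pairs shrinks within-class distances and enlarges between-class distances, which is the behavior the passage attributes to max-margin metric learning.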