2019
DOI: 10.1186/s40537-019-0267-3
Online Feature Selection (OFS) with Accelerated Bat Algorithm (ABA) and Ensemble Incremental Deep Multiple Layer Perceptron (EIDMLP) for big data streams

Abstract: Feature selection is mainly used to lessen the processing load of data mining models. To reduce the time for processing voluminous data, parallel processing is carried out with the MapReduce (MR) technique. However, with the existing algorithms, the performance of the classifiers needs substantial improvement. The MR method recommended in this research work performs feature selection in parallel, which improves performance. To enhance the efficacy of the classifier, this research work proposes a…
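The abstract describes scoring features in parallel with a MapReduce-style split. A toy sketch of that idea, using per-feature variance as a stand-in relevance score (the scoring function, block layout, and all names here are illustrative assumptions, not the paper's OFS-ABA implementation):

```python
from functools import reduce

def map_score(block):
    # "map" task: score one block of (feature_index, column_values) pairs,
    # here with sample variance as a stand-in relevance score
    out = {}
    for idx, col in block:
        m = sum(col) / len(col)
        out[idx] = sum((v - m) ** 2 for v in col) / len(col)
    return out

def reduce_merge(a, b):
    # "reduce" task: merge per-block score dictionaries
    a.update(b)
    return a

# Three feature columns split into two independent blocks
columns = {0: [1.0, 2.0, 3.0], 1: [5.0, 5.0, 5.0], 2: [0.0, 10.0, 0.0]}
blocks = [[(0, columns[0])], [(1, columns[1]), (2, columns[2])]]

scores = reduce(reduce_merge, map(map_score, blocks), {})
top = max(scores, key=scores.get)  # highest-scoring feature index
```

Because each block is scored independently, the `map` step can be distributed across workers; only the cheap dictionary merge is sequential.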

Cited by 14 publications (6 citation statements)
References 34 publications (36 reference statements)
“…However, the preprocessing operation is poor. The EIDMLP introduced in Reference 32 is good in classifying the features with higher Precision and F ‐measure. Since the computational complexity of this technique is lower, it is not preferred.…”
Section: Literature Review
confidence: 99%
“…In 2019, Devi et al 32 have developed a new framework referred as online feature selection–accelerated bat algorithm (OFS–ABA) in the feature space. With the map reduce framework, the authors have selected the nonsuperfluous and significant features in the OFS‐ABA method.…”
Section: Literature Review
confidence: 99%
“…The BA in the domain of feature selection has shown outstanding results and proved its scalability in addressing a such complex and high-dimensional problems such as gene selection [ 7 , 11 , 22 , 55 , 66 ], disease detection [ 64 , 65 , 135 ], cancer classification [ 143 ], fault diagnosis [ 105 ], and online feature selection [ 247 ]. The BA applications in this domain are presented in Fig.…”
Section: Applications Of Bat-inspired Algorithm
confidence: 99%
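For feature selection, the bat algorithm is typically run in binary form: each bat's position is a 0/1 mask over the features. A minimal sketch of one position update (velocity pull toward the global best, then a sigmoid transfer function), following the standard binary BA; the frequency value and all parameter choices below are illustrative assumptions:

```python
import math
import random

def update_bat(position, velocity, best, freq):
    # position, best: 0/1 feature masks; velocity: real-valued per feature
    new_vel, new_pos = [], []
    for x, v, g in zip(position, velocity, best):
        v = v + (x - g) * freq          # velocity pulled toward global best
        s = 1.0 / (1.0 + math.exp(-v))  # sigmoid transfer to [0, 1]
        new_vel.append(v)
        new_pos.append(1 if random.random() < s else 0)  # 1 = keep feature
    return new_pos, new_vel

random.seed(0)
pos, vel = update_bat([1, 0, 1, 0], [0.0] * 4, [1, 1, 0, 0], freq=0.5)
```

Each new bit is sampled stochastically, so even bits that already agree with the global best can flip, which keeps the search from collapsing prematurely.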
“…In this fitness function, a combination of classification accuracy in the KNN classification algorithm and the sum of similarities between the selected features is used. The fit of the feature subset in the iteration denoted by ( ( )) is measured by Equation (5).…”
Section: Step 4: Calculate Fitness Values
confidence: 99%
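The excerpt describes a fitness function combining KNN classification accuracy with the sum of similarities between selected features. A minimal sketch of such a weighted fitness, assuming leave-one-out 1-NN accuracy, absolute Pearson correlation as the similarity measure, and a 0.9/0.1 weighting (the measure, weights, and all names are assumptions, not the cited paper's exact formulation):

```python
import math

def knn_accuracy(X, y, feat_idx):
    # leave-one-out 1-NN accuracy restricted to the selected features
    correct = 0
    for i in range(len(X)):
        best_d, best_j = float("inf"), -1
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feat_idx)
            if d < best_d:
                best_d, best_j = d, j
        correct += y[best_j] == y[i]
    return correct / len(X)

def similarity_sum(X, feat_idx):
    # sum of |Pearson correlation| over selected feature pairs (redundancy)
    def col(f):
        return [row[f] for row in X]
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
        va = math.sqrt(sum((p - ma) ** 2 for p in a))
        vb = math.sqrt(sum((q - mb) ** 2 for q in b))
        return cov / (va * vb) if va and vb else 0.0
    s = 0.0
    for i in range(len(feat_idx)):
        for j in range(i + 1, len(feat_idx)):
            s += abs(corr(col(feat_idx[i]), col(feat_idx[j])))
    return s

def fitness(X, y, feat_idx, alpha=0.9):
    # reward accuracy, penalize redundancy among selected features
    return alpha * knn_accuracy(X, y, feat_idx) - (1 - alpha) * similarity_sum(X, feat_idx)

# Toy data: feature 0 alone separates the two classes
X = [[0.0, 5.0], [0.1, 1.0], [1.0, 4.0], [1.1, 0.0]]
y = [0, 0, 1, 1]
score = fitness(X, y, [0])
```

Subtracting the similarity term steers the search toward subsets that are both discriminative and non-redundant.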
“…Whenever the needed number of training examples cannot be provided, reducing features decreases the size of the needed training examples and hence increases the overall yield shape of the classification algorithm. In the previous years, two methods for dimensional reduction were presented: feature selection and feature extraction [4,5]. Feature selection (FS) seeks for a relevant subset of existing features, while features are designed for a new space of lower dimensionality in the feature extraction method.…”
Section: Introduction
confidence: 99%
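The distinction drawn above can be shown concretely: selection keeps a subset of the original columns unchanged, while extraction maps every column into a new, lower-dimensional space. A small sketch (the column indices and projection direction are arbitrary illustrations; in practice the direction would come from something like PCA):

```python
X = [[2.0, 0.5, 7.0],
     [4.0, 0.6, 6.0],
     [6.0, 0.4, 5.0]]

# Feature selection: keep columns 0 and 2; values stay interpretable
selected = [[row[0], row[2]] for row in X]

# Feature extraction: project each row onto a direction w, producing
# one new, derived feature per row
w = [0.8, 0.0, -0.6]
extracted = [[sum(x * wi for x, wi in zip(row, w))] for row in X]
```

The practical consequence is that selected features retain their original meaning and units, whereas extracted features are linear (or nonlinear) combinations that must be interpreted through the transform.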