“…In a wrapper feature selection model, Mafarja et al. [39] proposed two variants of the Whale Optimization Algorithm (WOA), called SWOA and VWOA. A dataset is minimized by removing redundant or irrelevant features in order to enhance the learning algorithms.…”
Section: Related Work Based On the Wrapper Approach (mentioning)
confidence: 99%
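The snippet above describes the wrapper approach: a candidate feature subset is scored by actually training and evaluating a learner, and a search procedure (here, WOA variants) explores subsets to improve that score. The sketch below is a minimal, hypothetical illustration of this loop — it uses a leave-one-out 1-nearest-neighbour classifier as the evaluator and a simple stochastic bit-flip search as a stand-in for the binary WOA update; none of the names, parameters, or data are from [39].

```python
import random

random.seed(0)

def make_data(n=60, n_features=6):
    """Synthetic data: features 0 and 1 carry the label, the rest are noise."""
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [label + random.gauss(0, 0.3),   # relevant feature
               label + random.gauss(0, 0.3),   # relevant feature
               *(random.gauss(0, 1) for _ in range(n_features - 2))]  # noise
        X.append(row)
        y.append(label)
    return X, y

def accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the selected features."""
    idx = [j for j, keep in enumerate(mask) if keep]
    if not idx:
        return 0.0
    correct = 0
    for i in range(len(X)):
        # nearest neighbour of sample i among all other samples
        best = min((sum((X[i][j] - X[k][j]) ** 2 for j in idx), k)
                   for k in range(len(X)) if k != i)
        correct += (y[best[1]] == y[i])
    return correct / len(X)

def wrapper_search(X, y, n_iter=40, flip_p=0.3):
    """Stochastic bit-flip search over feature masks (stand-in for binary WOA)."""
    n_features = len(X[0])
    best_mask = [random.randint(0, 1) for _ in range(n_features)]
    best_fit = accuracy(X, y, best_mask)
    for _ in range(n_iter):
        # flip each bit of the current best mask with probability flip_p
        cand = [b ^ (random.random() < flip_p) for b in best_mask]
        fit = accuracy(X, y, cand)
        if fit > best_fit:          # keep the candidate only if it improves
            best_mask, best_fit = cand, fit
    return best_mask, best_fit

X, y = make_data()
mask, fit = wrapper_search(X, y)
print(mask, round(fit, 2))
```

The key cost driver the survey's critique points at is visible here: every candidate subset requires a full model evaluation, which is why wrapper methods tend to need extra computing power on large datasets.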
“…There have been several articles that have discussed combining the WOA method with other methods in order to reduce execution time, but these efforts have not been successful. We can point out some of the main negative points in Table 10, including the lack of scalability for large datasets ([32], [44] and [53]), the risk of overfitting ([48] and [49]), not considering the dependency of features ([46] and [52]) and the need for additional computing power ([37], [39] and [51]).…”
Section: Nb (mentioning)
confidence: 99%
“…[36], [37], [38], [39], [40], [41], [42], [43], [44]: Filter. There is no expectation that every metaheuristic feature selection algorithm will work well for every problem. In this regard, it is important to report underperforming results as well.…”
A large number of features is the main problem in big data, leading to the curse of dimensionality, and feature selection is suggested as a solution. The process of feature selection consists of retaining the features relevant to a learning model and eliminating irrelevant or redundant ones. The feature selection community has recently been drawn to swarm intelligence techniques due to their simplicity and potential global search capabilities. A straightforward overview of the newest research in the feature selection field is provided here using a nature-inspired metaheuristic method called the Whale Optimization Algorithm (WOA). The research is presented in terms of various types of state-of-the-art methods and their advantages and disadvantages, encouraging researchers to investigate more advanced approaches. A discussion of possible limitations and issues for future research is included, as well as guidance for practitioners on selecting appropriate methods for real-world situations.