2021
DOI: 10.1080/0952813x.2021.1938698
An intelligent ensemble classification method based on multi-layer perceptron neural network and evolutionary algorithms for breast cancer diagnosis

Cited by 32 publications (13 citation statements); References 47 publications
“…The MLP classifier, as its name implies, is built on a neural network. When performing classification, the MLP classifier relies on an underlying neural network [41]. Suppose there are N hidden layers, each containing η neurons.…”
Section: F. Multi-Layer Perceptron (MLP). Citation type: mentioning. Confidence: 99%
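The structure described in the statement above, N hidden layers with η neurons each, can be sketched as a plain forward pass. This is a minimal illustration, not the cited paper's model; the layer sizes, ReLU activation, and softmax output are assumptions chosen for the example.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass through an MLP: each hidden layer applies an
    affine map followed by ReLU; the output layer applies softmax."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.maximum(0.0, a @ W + b)        # hidden layer with ReLU
    logits = a @ weights[-1] + biases[-1]     # output layer
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

# Illustrative shapes (hypothetical): N = 2 hidden layers, eta = 4
# neurons each, 3 input features, 2 output classes.
rng = np.random.default_rng(0)
sizes = [3, 4, 4, 2]
weights = [rng.standard_normal((m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
probs = mlp_forward(rng.standard_normal(3), weights, biases)
```

The returned vector is a probability distribution over the output classes, which is the quantity a classifier built on such a network would threshold or argmax.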
“…Modeling the policy of attracting the imperialists: the policy of attracting the imperialists means that the colonial countries must move toward their imperialists. This section is based on the differential evolution (DE) technique according to [38].…”
Section: Proposed Approach. Citation type: mentioning. Confidence: 99%
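The assimilation step described above, colonies moving toward their imperialist with a DE-style perturbation, can be sketched as follows. This is an illustrative reconstruction, not the cited paper's exact update rule; the step-size parameter `beta`, the scale factor `f`, and the donor vectors are assumptions.

```python
import random

def assimilate(colony, imperialist, beta=2.0):
    """Classic ICA assimilation: each coordinate of the colony shifts by
    a random fraction (up to beta) of its distance to the imperialist."""
    return [c + beta * random.random() * (i - c)
            for c, i in zip(colony, imperialist)]

def de_assimilate(colony, imperialist, donor_a, donor_b, f=0.5):
    """DE-flavoured variant: the move toward the imperialist is combined
    with a scaled difference of two other colonies, as in DE mutation."""
    return [c + random.random() * (i - c) + f * (a - b)
            for c, i, a, b in zip(colony, imperialist, donor_a, donor_b)]

# Toy usage in 2-D: a colony is pulled toward its imperialist.
colony, imperialist = [0.0, 1.0], [2.0, 3.0]
moved = assimilate(colony, imperialist, beta=1.0)
```

With `beta` up to 2, a colony can overshoot its imperialist, which is the standard ICA way of letting colonies explore beyond the current best point.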
“…On this basis, the metaheuristic algorithm is used to traverse the model's entire search space to obtain the optimal solution. It has been successfully applied to many tasks, such as neuroevolutionary algorithms for optimizing deep reinforcement learning models [20], genetic algorithms (GA) for optimizing BP neural networks [21], and evolutionary algorithms (EA) for optimizing the hyperparameters of neural networks [22]. In a recent prediction study, Xie Hailun et al. proposed a prediction model based on an improved grey wolf optimization (GWO) algorithm optimizing a CNN-LSTM. It enhances the search ability of the GWO algorithm by introducing a different searching mechanism, and the optimized CNN-LSTM network provides better characterization ability: it not only captures the interactions among important features but also encapsulates the complex temporal dependencies underlying time-series tasks [23].…”
Section: Introduction, 1. Literature Review. Citation type: mentioning. Confidence: 99%
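The pattern mentioned in the statement above, an evolutionary algorithm searching over neural-network hyperparameters, can be sketched with a minimal (μ+λ)-style loop over integer genomes (e.g. neurons per hidden layer). The fitness function here is a hypothetical stand-in for validation accuracy, not anything from the cited works.

```python
import random

def evolve(fitness, init_pop, generations=30, mut_scale=1):
    """Minimal elitist evolutionary search over integer hyperparameter
    genomes: keep the fitter half, mutate each survivor into one child."""
    pop = [list(p) for p in init_pop]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: max(2, len(pop) // 2)]          # elitist selection
        children = [[max(1, g + random.randint(-mut_scale, mut_scale))
                     for g in p] for p in parents]      # integer mutation
        pop = parents + children
    return max(pop, key=fitness)

# Hypothetical fitness: pretend validation accuracy peaks at 16 neurons.
fit = lambda genome: -sum((g - 16) ** 2 for g in genome)
init = [[random.randint(1, 64)] for _ in range(10)]
best = evolve(fit, init, generations=50)
```

Because the fitter half always survives unchanged, the best fitness in the population is monotonically non-decreasing, a property real hyperparameter EAs also rely on.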