2023
DOI: 10.1038/s41598-023-37540-z

A novel neural network model with distributed evolutionary approach for big data classification

Abstract: The considerable improvement of technology produced for various applications has resulted in a growth in data sizes, such as healthcare data, which is renowned for having a large number of variables and data samples. Artificial neural networks (ANN) have demonstrated adaptability and effectiveness in classification, regression, and function approximation tasks. ANN is used extensively in function approximation, prediction, and classification. Irrespective of the task, ANN learns from the data by adjusting the …

Cited by 9 publications (4 citation statements)
References 29 publications
“…The Backpropagation algorithms (BPs) 3 or Extreme Learning Machines (ELM) 4 belong to the first group, commonly used for weight optimisation, while Evolutionary Algorithms (EAs) 5 belong to the second group. This second group is usually referred to as Neuroevolution 6 , that is, the application of metaheuristics such as EAs to the evolution of ANNs, also known in the literature as Evolutionary Artificial Neural Networks (EANNs) 7 – 9 , so that both the weights and the ANN architecture are optimised. EAs are an effective tool here because finding a suitable ANN architecture remains an open problem in Machine Learning (ML), requiring extensive trial and error and a great deal of experience on the researcher's part 2 .…”
Section: Introduction
confidence: 99%
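The neuroevolution idea quoted above (optimising ANN weights with an evolutionary algorithm rather than backpropagation) can be illustrated with a minimal sketch. This is a hypothetical toy example, not the method of the cited paper: a fixed 2-4-1 network whose 17 weights are evolved with a simple elitist (mu + lambda) strategy on the XOR task; all names and hyperparameters here are assumptions chosen for illustration.

```python
# Hypothetical neuroevolution sketch: evolve the weights of a tiny
# fixed-architecture ANN with a (mu + lambda) evolutionary strategy
# instead of gradient-based training. Toy example only.
import numpy as np

rng = np.random.default_rng(0)

# XOR toy dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_WEIGHTS = 2 * 4 + 4 + 4 + 1  # w1 (2x4) + b1 + w2 + b2 = 17

def forward(w, x):
    """Feed-forward pass for a 2-4-1 network (tanh hidden, sigmoid output)."""
    w1 = w[:8].reshape(2, 4)
    b1 = w[8:12]
    w2 = w[12:16]
    b2 = w[16]
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def fitness(w):
    """Negative mean squared error, so higher is better."""
    return -np.mean((forward(w, X) - y) ** 2)

def evolve(mu=10, lam=40, sigma=0.5, generations=300):
    """Elitist (mu + lambda) ES with Gaussian mutation and sigma decay."""
    pop = [rng.normal(0.0, 1.0, N_WEIGHTS) for _ in range(mu)]
    for _ in range(generations):
        offspring = [p + rng.normal(0.0, sigma, N_WEIGHTS)
                     for p in pop for _ in range(lam // mu)]
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:mu]
        sigma *= 0.995  # shrink mutation step as the search narrows
    return pop[0]

best = evolve()
preds = (forward(best, X) > 0.5).astype(float)
print(preds)  # expected to approach y = [0, 1, 1, 0] on this toy task
```

Note that only the weights are evolved here; the EANN work referenced in the quote also evolves the architecture itself, which this sketch deliberately omits for brevity.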
“…A large and diverse training dataset can lead to more accurate neural network models as it helps cover a broader range of input possibilities. Conversely, insufficient or low-quality training data can result in poorer performance and inaccurate predictions [59,60]. It is also essential that the training data is representative of the problem domain and the inputs that the neural network is expected to handle.…”
Section:  Impacts Of Training Datamentioning
confidence: 99%
“…Preprocessing the data this way can lead to more efficient neural networks by ensuring that the input data is consistent and that the neural network can better recognize relevant patterns and relationships. In summary, it is important to have a diverse, representative, and preprocessed dataset to ensure optimal performance [60].…”
Section:  the Quality And Quantity Of Training Datamentioning
confidence: 99%
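The preprocessing step described in the statement above can be sketched with a common concrete instance: feature standardisation. This is an illustrative assumption (the citing paper does not specify this exact transform); the point it shows is fitting the statistics on the training split only, so no test-set information leaks into training.

```python
# Illustrative preprocessing sketch: z-score standardisation so the
# network sees all input features on a consistent scale.
import numpy as np

def standardise(train, test):
    """Fit mean/std on the training split only, then apply the same
    transform to both splits to avoid test-set leakage."""
    mean = train.mean(axis=0)
    std = train.std(axis=0)
    std[std == 0] = 1.0  # guard against constant features
    return (train - mean) / std, (test - mean) / std

train = np.array([[1.0, 200.0], [2.0, 400.0], [3.0, 600.0]])
test = np.array([[2.0, 500.0]])
train_s, test_s = standardise(train, test)
print(train_s.mean(axis=0))  # approximately [0. 0.]
```

Without this kind of scaling, a feature measured in hundreds would dominate one measured in single digits, slowing or biasing training.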
“…This dynamic collaboration is weaving a tapestry of innovation, reshaping entire industries, and transforming user experiences. The journey through this terrain includes VM-based cloudlets 36 , supercharging efficiency in mobile devices, the development of programming frameworks for large-scale IoT applications 37 , a comprehensive exploration of mobile cloud computing 38 , the crucial role of fog computing in extending resources to the digital frontier, real-time handling of IoT data at the edge 39 , in-depth dives into the rich landscape of benefits and challenges within edge computing 40 , and a survey of multi-access edge computing within the architectural realm of 5G 41 .…”
confidence: 99%