2023
DOI: 10.3390/e25050703
Study on the Use of Artificially Generated Objects in the Process of Training MLP Neural Networks Based on Dispersed Data

Abstract: This study concerns dispersed data stored in independent local tables with different sets of attributes. The paper proposes a new method for training a single neural network—a multilayer perceptron—based on dispersed data. The idea is to train local models that have identical structures based on local tables; however, due to the different sets of conditional attributes present in local tables, it is necessary to generate some artificial objects to train local models. The paper presents a study on the use of varyin…

Cited by 1 publication (1 citation statement).
References 17 publications (20 reference statements).
“…Furthermore, if a node has more than one input, the final value of that node's function corresponds to the sum of the individual values of the functions of the node and their connections. Each step of the iterative training process ends with backward propagation, in which the error is propagated back toward the input layers and the values of the weights are updated [65,66]. As in other ML methods, learning is based on minimizing a cost function J.…”
Section: Multilayer Perceptron
confidence: 99%
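The excerpt above describes the two phases of MLP training: a forward pass in which each node sums its weighted inputs and applies an activation function, and backward propagation in which the error is pushed back toward the input layer and the weights are updated to minimize a cost function J. A minimal sketch of that loop, assuming one hidden layer, sigmoid activations, and mean-squared-error as J (all sizes and names are illustrative, not taken from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, a classic task an MLP with a hidden layer can solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))  # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
b2 = np.zeros(1)

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass: each node sums its weighted inputs, then applies
    # the activation function.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Cost J: mean squared error over the training objects.
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward propagation: the error is propagated back toward the
    # input layer and the weights are updated by gradient descent.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(losses[-1] < losses[0])  # the cost J decreases over training
```

Each iteration terminates with the backward-propagation step, exactly as the excerpt states; the update direction is the negative gradient of J with respect to each weight matrix.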