2019
DOI: 10.15587/1729-4061.2019.164789
Development of the modified methods to train a neural network to solve the task on recognition of road users

Abstract: Modifications of a simple genetic algorithm for pattern recognition have been developed. In the proposed AlphaBeta modification, at the stage of selecting individuals for a new population, the individuals are ranked by their fitness value; then the number of pairs is determined randomly: a certain number of the fittest individuals and the same number of the least fit ones. The fittest individuals form subset B, the least fit ones form subset W. Both subsets enter the set of pairs V. The number of individuals that can be selected into pair…
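A minimal sketch of the pair-selection step described in the abstract, in Python. The subsets B and W and the pair set V follow the abstract; the upper bound on the number of pairs and the rule that each pair couples one individual from B with one from W are assumptions, since the abstract is truncated.

import random

def alpha_beta_selection(population, fitness, rng=random.Random(0)):
    # Rank individuals by fitness, the fittest first.
    ranked = sorted(population, key=fitness, reverse=True)
    # The number of pairs is chosen at random (assumed bound: half the population).
    k = rng.randint(1, len(ranked) // 2)
    # Subset B holds the k fittest individuals, subset W the k least fit ones.
    B = ranked[:k]
    W = ranked[-k:]
    # The set of pairs V couples each individual from B with one from W (assumption).
    V = list(zip(B, W))
    return V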

Cited by 15 publications (7 citation statements)
References 25 publications
“…Thus, it is proposed, at each iteration of the method, after calculating the fitness function, to rank the individuals of the received generation by their value of mutation resistance. In contrast to the classical operator, we specify at the outset not the probability of a mutation but the proportion of individuals that are subjected to operator (25).…”
Section: Development of a Method for Constructing a Neural Network Model
confidence: 79%
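A minimal sketch of this proportion-based mutation step, in Python. The ranking key and the choice to mutate the worst-ranked individuals are assumptions, and the concrete mutation operator (operator (25) in the citing paper) is replaced here by small Gaussian noise purely for illustration.

import random

def mutate_by_proportion(population, rank_key, proportion, rng=random.Random(0)):
    # Rank the generation; the citing paper ranks by "mutation resistance",
    # here any numeric key can be supplied.
    ranked = sorted(population, key=rank_key, reverse=True)
    # A fixed proportion of individuals, not a per-individual probability,
    # is handed to the mutation operator.
    n_mutated = int(len(ranked) * proportion)
    kept = ranked[:len(ranked) - n_mutated]
    # Assumed stand-in for operator (25): Gaussian perturbation of each gene.
    mutated = [[gene + rng.gauss(0.0, 0.1) for gene in individual]
               for individual in ranked[len(ranked) - n_mutated:]]
    return kept + mutated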
“…The Adam algorithm was used for optimization [21]. Adam is an optimization algorithm that can be used instead of the classical stochastic gradient descent procedure to iteratively update the network weights based on training data [22]. The algorithm combines the advantages of classic gradient descent extensions such as the adaptive gradient algorithm (AdaGrad) and the moving average of squared gradients (RMSProp).…”
Section: Development of a Method for Constructing a Neural Network Model
confidence: 99%
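A minimal sketch of a single Adam parameter update, in Python with NumPy, showing how Adam keeps exponential moving averages of the gradient and of its square and applies a bias-corrected, per-parameter adaptive step. The hyperparameter values are the commonly used defaults, not values reported by the citing paper.

import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving average of the gradient (first moment).
    m = beta1 * m + (1 - beta1) * grad
    # Exponential moving average of the squared gradient (second moment, RMSProp-style).
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialized moments (t counts steps from 1).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Per-parameter adaptive update of the weights.
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v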
“…If this number is less than the mutation coefficient, then the mutation procedure is started for this chromosome. This procedure is as follows [34]:…”
Section: Development of an Algorithm for Determining Locations for Pl…
confidence: 99%
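A minimal sketch of the mutation trigger described in this statement, in Python: for each chromosome a uniform random number is drawn and compared against the mutation coefficient. The mutation procedure itself ([34] in the citing paper) is not reproduced and is passed in as a placeholder function.

import random

def apply_mutation(population, mutation_coefficient, mutate, rng=random.Random(0)):
    next_generation = []
    for chromosome in population:
        # Draw a uniform random number in [0, 1) for this chromosome.
        if rng.random() < mutation_coefficient:
            # Below the mutation coefficient: run the mutation procedure.
            next_generation.append(mutate(chromosome))
        else:
            next_generation.append(chromosome)
    return next_generation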