2022
DOI: 10.1155/2022/4703682

Investigation of Effectiveness of Shuffled Frog-Leaping Optimizer in Training a Convolution Neural Network

Abstract: One of the leading algorithms and architectures in deep learning is the Convolution Neural Network (CNN). It represents a unique method for image processing, object detection, and classification. CNN has been shown to be an efficient approach in the machine learning and computer vision fields. A CNN is composed of several filters accompanied by nonlinear functions and pooling layers. It enforces limitations on the weights and interconnections of the neural network to create a good structure for processing spatial and temporal …
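As a concrete point of reference for the structure the abstract describes (convolution filters followed by nonlinear functions and pooling layers), the sketch below builds a minimal CNN classifier in PyTorch. It is an illustrative example only, not the network trained in the paper and not the shuffled frog-leaping training scheme; the layer widths, the 32x32 RGB input, and the 10-class output are assumptions made for the sketch.

# Minimal CNN sketch: learnable filters + nonlinearity + pooling, as described in the abstract.
# Illustrative only; layer sizes, input shape (3, 32, 32), and the 10-class head are assumptions.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolution filters
            nn.ReLU(),                                     # nonlinear function
            nn.MaxPool2d(2),                               # pooling layer: 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                # spatially structured feature maps
        return self.classifier(x.flatten(1))

# Example usage: a batch of four 32x32 RGB images -> class scores of shape (4, 10).
logits = SmallCNN()(torch.randn(4, 3, 32, 32))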

Cited by: 27 publications (9 citation statements)
References: 70 publications

“…However, the jump to three rooms' worth of data results in a significant jump in accuracy, getting much closer to the acceptable levels. The main steps of this work have been revised in light of these references (Ranjbarzadeh et al., 2021; Anari et al., 2022; Ranjbarzadeh et al., 2022; Saadi et al., 2022; Ranjbarzadeh et al., 2023a; Ranjbarzadeh et al., 2023b).…”
Section: Results (mentioning)
confidence: 99%
“…In the context of data mining and analysis, deep learning models (DLs) are currently paving new avenues for POI recommendation systems. Unlike earlier shallow neural network approaches, such as the artificial neural networks (ANNs) that have been explored for many years, DL-based structures are characterized by a considerably larger number of densely connected neural layers [48,49,50,51]. This larger number of layers can mine hidden patterns and higher-level features and can discover more complex, hierarchical relationships.…”
Section: Methods (mentioning)
confidence: 99%
“…In order to explore the influence of E1, E2, E3, and E4 on the energy function E, the score functions Φ1, Φ2, Φ3, and Φ4 are defined. DMuRPss without entity descriptions is used as the baseline, and its score function is defined as Φ1 (Equation (12)). It is worth noting that Φ1 is also the definition of the MuRP [18] score function.…”
Section: DMuRP Methods (mentioning)
confidence: 99%
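For background on the Φ1 baseline referenced in the excerpt above: the cited work's Equation (12) is not reproduced here, but the excerpt states that Φ1 coincides with the MuRP [18] score function. The LaTeX below recalls the standard MuRP scoring form only as a hedged reminder of what such a baseline typically looks like; it is drawn from the original MuRP formulation, not from the cited paper, so the exact notation of its Equation (12) may differ.

\phi(s, r, o) = -\, d_{\mathbb{B}}\!\left(h_s^{(r)}, h_o^{(r)}\right)^{2} + b_s + b_o,
\qquad h_s^{(r)} = \exp_0^{c}\!\left(R \, \log_0^{c}(h_s)\right),
\qquad h_o^{(r)} = h_o \oplus_c r_h,

where d_{\mathbb{B}} is the geodesic distance on the Poincaré ball, R is a diagonal relation matrix, r_h is a relation translation vector, and b_s, b_o are entity biases.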
“…Translation models and bilinear models are relatively simple, and it is difficult for them to fully capture the relationships among the entities of a triple. Neural network models and rotation models often require more memory to represent entity embeddings in order to capture more semantic information [12][13][14]. To reduce this memory demand, hyperbolic models, which embed entities in hyperbolic space, have attracted growing attention in recent years [15,16].…”
Section: Introduction (mentioning)
confidence: 99%