2022
DOI: 10.3390/s22124426

Performance Analysis of State-of-the-Art CNN Architectures for LUNA16

Abstract: The convolutional neural network (CNN) has become a powerful tool in machine learning (ML) that is used to solve complex problems such as image recognition, natural language processing, and video analysis. Notably, the idea of exploring convolutional neural network architectures has gained substantial attention as well as popularity. This study focuses on the intrinsic analysis of various CNN architectures: LeNet, AlexNet, VGG16, ResNet-50, and Inception-V1, which have been scrutinized and compared with each other for the …

Cited by 42 publications (17 citation statements) · References 51 publications
“…where η: initial learning rate; q_t: exponential average of the gradient along w_j; p_t: gradient at time t along w_j; s_t: exponential average of the squared gradient along w_j; γ_1 and γ_2: hyperparameters. SGD (stochastic gradient descent) is the third optimiser that we chose for performance evaluation; this optimiser updates the parameters one at a time [3]. It is therefore much faster than the other optimisers, and the cost function is minimised after each iteration. It also requires less memory. However, because it updates the model parameters so frequently, the cost function can fluctuate heavily, which is why the gradient can jump past the global minimum point.…”
Section: Methods (mentioning)
confidence: 99%
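The excerpt above contrasts an Adam-style optimiser (exponential averages q_t and s_t of the gradient and its square, with decay rates γ_1 and γ_2) against plain SGD. As a rough illustration only, the sketch below implements both update rules in NumPy on a toy quadratic loss; the loss function, step count, and hyperparameter values are arbitrary assumptions for demonstration, not settings from the paper.

```python
import numpy as np

# Minimal sketch of the two update rules described in the excerpt, using its
# notation: eta (learning rate), p_t (gradient), q_t (exponential average of
# gradients), s_t (exponential average of squared gradients), gamma_1 and
# gamma_2 (decay hyperparameters). The quadratic toy loss is illustrative.

def grad(w):
    """Gradient of the toy loss f(w) = 0.5 * ||w - 1||^2."""
    return w - 1.0

def sgd_step(w, eta=0.1):
    # Plain SGD: step against the current gradient only.
    return w - eta * grad(w)

def adam_like_step(w, q, s, t, eta=0.1, gamma_1=0.9, gamma_2=0.999, eps=1e-8):
    # Adam-style step: exponential averages of the gradient (q_t) and of its
    # square (s_t), with bias correction, scale each coordinate's update.
    p = grad(w)                              # p_t
    q = gamma_1 * q + (1 - gamma_1) * p      # q_t
    s = gamma_2 * s + (1 - gamma_2) * p**2   # s_t
    q_hat = q / (1 - gamma_1**t)             # bias-corrected averages
    s_hat = s / (1 - gamma_2**t)
    w = w - eta * q_hat / (np.sqrt(s_hat) + eps)
    return w, q, s

w_sgd = np.array([4.0, -2.0])
w_adam, q, s = np.array([4.0, -2.0]), np.zeros(2), np.zeros(2)
for t in range(1, 51):
    w_sgd = sgd_step(w_sgd)
    w_adam, q, s = adam_like_step(w_adam, q, s, t)
print(w_sgd, w_adam)  # both approach the minimiser [1, 1]
```

On this toy problem both rules converge; the practical differences the excerpt describes (speed, memory, fluctuation of the cost) only show up on larger, noisier training runs.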
“…[2] The domains of image analysis and computer vision are particularly well suited to the application of deep learning to real-world situations [3]. When it comes to image classification and detection problems, the convolutional neural network (CNN) stands out as a top deep learning model [4], owing to its ability to automatically extract features from images and accurately classify or segment them. Medical imaging is currently the best method for identifying and treating diseases in their earliest stages.…”
Section: Introduction (mentioning)
confidence: 99%
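To make the CNN description above concrete, here is a minimal PyTorch sketch of a small convolutional classifier. It is not one of the architectures benchmarked in the paper (LeNet, AlexNet, VGG16, ResNet-50, Inception-V1); the 64×64 single-channel input and the two-class output (e.g. nodule vs. non-nodule patches) are illustrative assumptions only.

```python
import torch
import torch.nn as nn

# Small CNN classifier sketch: stacked convolution + pooling layers extract
# features automatically, and a linear head performs the classification.
class SmallCNN(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # learn local filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallCNN()
logits = model(torch.randn(4, 1, 64, 64))   # batch of 4 dummy patches
print(logits.shape)                          # torch.Size([4, 2])
```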
“…The "■" feature appears in the upper-left corner, the lower-right corner, and the middle of the image. After calculations are performed for the convolution layer and pooling layer,  is similarly derived, meaning that the feature can be detected wherever it appears [24].…”
Section: Figure 3 Image Of Covered Linesmentioning
confidence: 99%
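The point of the excerpt, that a convolutional filter detects the "■" pattern wherever it appears, can be illustrated with a small NumPy sketch: the same 2×2 filter produces its maximum response whether the patch sits in the upper-left corner, the lower-right corner, or the middle. The images, filter, and sizes below are toy assumptions, not the cited paper's setup.

```python
import numpy as np

# Shift-equivariance sketch: one filter, one pattern, three positions.
def place_square(pos, size=8):
    img = np.zeros((size, size))
    r, c = pos
    img[r:r+2, c:c+2] = 1.0       # a 2x2 "■" patch
    return img

kernel = np.ones((2, 2))           # detector matched to the 2x2 patch

def conv2d_valid(img, k):
    # Plain "valid" 2-D cross-correlation, written out with loops.
    H, W = img.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * k)
    return out

for pos in [(0, 0), (6, 6), (3, 3)]:          # upper-left, lower-right, middle
    response = conv2d_valid(place_square(pos), kernel)
    print(pos, response.max())                # max response is 4.0 every time
```

Max pooling over the response map then reports "pattern present" for the region regardless of where the patch was, which is the behaviour the citation describes.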
“…This is calculated for both x and ŷ [69]. Another adaptive learning technique, for addressing rapidly diminishing learning rates, is root mean square propagation (RMSProp).…”
Section: Introduction (mentioning)
confidence: 99%
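For reference, root mean square propagation keeps a running average of squared gradients and scales each coordinate's step by the inverse square root of that average, so coordinates with persistently large gradients receive smaller effective learning rates. The NumPy sketch below shows that update on a hypothetical badly scaled quadratic; the decay rate, learning rate, and loss are illustrative assumptions, not values from the cited work.

```python
import numpy as np

# RMSProp sketch: running mean of squared gradients, per-coordinate scaling.
def grad(w):
    return np.array([10.0 * w[0], 0.1 * w[1]])   # badly scaled quadratic loss

def rmsprop(w, steps=200, eta=0.05, rho=0.9, eps=1e-8):
    s = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)
        s = rho * s + (1 - rho) * g**2           # running mean of squared grads
        w = w - eta * g / (np.sqrt(s) + eps)     # per-coordinate scaled step
    return w

print(rmsprop(np.array([3.0, 3.0])))             # approaches the minimiser [0, 0]
```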