2022
DOI: 10.3390/su142214695
On Disharmony in Batch Normalization and Dropout Methods for Early Categorization of Alzheimer’s Disease

Abstract: Alzheimer’s disease (AD) is a global health issue that predominantly affects older people. It affects one’s daily activities by modifying neural networks in the brain. AD is characterized by the death of neurons, the creation of amyloid plaques, and the development of neurofibrillary tangles. In clinical settings, an early diagnosis of AD is critical to limit the problems associated with it and can be accomplished using neuroimaging modalities, such as magnetic resonance imaging (MRI) and positron emission tomog…

Cited by 12 publications (5 citation statements)
References 71 publications (59 reference statements)
“…In the same way, we added convolution layers with filter sizes of 32, 64, 128, and 256, each with a 3 × 3 kernel, a 1 × 1 stride, and valid padding. Subsequently, we applied global average pooling [49], flatten, and dense [50] layers (in the dense layer, we used 512 neurons with L1 (10⁻⁵) and L2 (10⁻⁴) kernel regularization), followed by a dropout [51] layer with a rate of 0.5. In the end, the softmax function [47] was utilized with the output layer to determine the likelihood score for each class and classify the decision label as to whether the input image contained a glioma, meningioma, or pituitary tumor.…”
Section: Methods
confidence: 99%
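The classifier head quoted above (global average pooling over the final 256-channel feature maps, a 512-unit dense layer, then softmax over the three tumor classes) can be sketched in plain NumPy. This is a minimal illustration, not the authors’ implementation: the feature-map size and all weight values here are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def global_average_pool(feature_maps):
    # (height, width, channels) -> (channels,): mean over spatial dims
    return feature_maps.mean(axis=(0, 1))

def softmax(z):
    # numerically stable softmax over the class logits
    e = np.exp(z - z.max())
    return e / e.sum()

# 256 channels from the last conv block, a 512-unit dense layer, and
# 3 classes (glioma, meningioma, pituitary); weights are placeholders
W1 = rng.normal(0.0, 0.05, size=(256, 512))
W2 = rng.normal(0.0, 0.05, size=(512, 3))

feature_maps = rng.normal(size=(7, 7, 256))  # hypothetical conv output
pooled = global_average_pool(feature_maps)   # shape (256,)
hidden = np.maximum(W1.T @ pooled, 0.0)      # dense layer + ReLU
probs = softmax(W2.T @ hidden)               # per-class probabilities

print(probs)  # three non-negative scores summing to 1
```

The softmax output is what the quoted passage calls the “likelihood score for each class”; the predicted label is simply `probs.argmax()`.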
“…To mitigate overfitting, the dense layer was regularized using the L1 (10⁻⁵) and L2 (10⁻⁴) techniques [53]. During training, the neurons within a dropout layer [54] were randomly deactivated at a rate of 0.5 to further strengthen regularization. Finally, the output layer employed the softmax function [51] to compute the probability score for each class and classify whether the input image exhibited a glioma, meningioma, pituitary tumor, or no tumor.…”
Section: Methods
confidence: 99%
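The regularization recipe in this passage (an L1 penalty at 10⁻⁵ and an L2 penalty at 10⁻⁴ on the dense kernel, plus dropout) can be illustrated with a small NumPy sketch. The weight matrix is a random stand-in, and a dropout rate of 0.5 is assumed where the quoted text reads “0.5%”.

```python
import numpy as np

rng = np.random.default_rng(42)

L1, L2 = 1e-5, 1e-4  # regularization strengths from the quoted passage

def kernel_penalty(W, l1=L1, l2=L2):
    # combined L1 + L2 penalty added to the training loss
    return l1 * np.abs(W).sum() + l2 * np.square(W).sum()

def dropout(x, rate=0.5, training=True):
    # inverted dropout: zero units at `rate`, rescale the survivors
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

W = rng.normal(0.0, 0.05, size=(512, 4))  # dense kernel, 4 classes
activations = rng.normal(size=(512,))

penalty = kernel_penalty(W)   # scalar added to the loss
dropped = dropout(activations)
```

The inverted-dropout scaling (`/ (1 - rate)`) keeps the expected activation magnitude unchanged, so no rescaling is needed at inference time.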
“…It can also be represented as a very simple Bayesian network. Naive Bayes classifiers are particularly popular for classification and are a traditional solution to problems such as spam detection [54–56]. NB performance, with truth data, classifier results, and accuracy, is presented in Figure 9.…”
Section: Naive Bayes
confidence: 99%
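As a concrete illustration of the spam-detection use case mentioned above, a multinomial Naive Bayes classifier can be written from scratch in a few lines. The tiny bag-of-words dataset is invented purely for this sketch and is not from the cited work.

```python
import numpy as np

# toy word counts: columns = ["free", "win", "meeting", "report"]
X = np.array([
    [3, 2, 0, 0],   # spam
    [2, 3, 0, 1],   # spam
    [0, 0, 2, 3],   # ham
    [0, 1, 3, 2],   # ham
])
y = np.array([1, 1, 0, 0])  # 1 = spam, 0 = ham

def fit_multinomial_nb(X, y, alpha=1.0):
    # per-class log priors and Laplace-smoothed log word likelihoods
    classes = np.unique(y)
    log_prior = np.log(np.array([(y == c).mean() for c in classes]))
    counts = np.array([X[y == c].sum(axis=0) for c in classes]) + alpha
    log_likelihood = np.log(counts / counts.sum(axis=1, keepdims=True))
    return classes, log_prior, log_likelihood

def predict(x, classes, log_prior, log_likelihood):
    # pick the class maximizing log P(c) + sum_w count(w) * log P(w|c)
    scores = log_prior + log_likelihood @ x
    return classes[np.argmax(scores)]

model = fit_multinomial_nb(X, y)
print(predict(np.array([4, 1, 0, 0]), *model))  # spam-like counts -> 1
print(predict(np.array([0, 0, 4, 1]), *model))  # ham-like counts -> 0
```

Working in log space avoids underflow from multiplying many small word probabilities, which is the standard trick behind practical Naive Bayes spam filters.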