2021 6th International Conference on Communication and Electronics Systems (ICCES) 2021
DOI: 10.1109/icces51350.2021.9489012
Classification of benign-malignant pulmonary lung nodules using ensemble learning classifiers

Cited by 3 publications (1 citation statement). References 21 publications.
“…In contrast to deep neural networks (CNNs) which use 2D layers, ANNs refer to multiple 1D layers of neurons stacked on top of each other for classification of features. They are also commonly known as multi-layer perceptrons, or feed-forward neural networks, and have been successfully used in medical image classification such as classifying CT scans containing lung nodules [30] and skin lesion malignancy [31]. For the current study, the ANN was implemented in Python's Keras library, using an input layer with the same length as each row of features (560), 5 hidden Dense layers of 550 neurons each, and a Dense-3 layer with softmax activation at the output, as illustrated in Fig.…”
Section: Methods
confidence: 99%
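The architecture in the quoted statement (a 560-input MLP with five 550-neuron hidden layers and a 3-class softmax output) can be sketched as a plain forward pass. Only the layer sizes come from the quote; the ReLU activation, random weight initialization, and batch size here are illustrative assumptions, not details from the cited paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, weights, biases):
    """Forward pass through the MLP: hidden layers then softmax output.

    ReLU on hidden layers is an assumption; the quoted text does not
    state which hidden activation was used.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(h @ W + b, 0.0)  # assumed ReLU hidden activation
    return softmax(h @ weights[-1] + biases[-1])  # softmax output (as quoted)

# Layer sizes taken from the quoted statement: 560 input features,
# five hidden layers of 550 neurons, 3 output classes.
sizes = [560] + [550] * 5 + [3]
rng = np.random.default_rng(0)  # illustrative random initialization
weights = [rng.normal(scale=0.05, size=(m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

# Run 4 dummy feature rows through the network
probs = forward(rng.normal(size=(4, 560)), weights, biases)
```

Each row of `probs` is a probability distribution over the three output classes, matching the Dense-3 softmax layer described in the quote.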