2022
DOI: 10.3390/computers11090136
Predicting Breast Cancer from Risk Factors Using SVM and Extra-Trees-Based Feature Selection Method

Abstract: Developing a prediction model from risk factors can provide an efficient method to recognize breast cancer. Machine learning (ML) algorithms have been applied to increase the efficiency of diagnosis at the early stage. This paper studies a support vector machine (SVM) combined with an extremely randomized trees classifier (extra-trees) to provide a diagnosis of breast cancer at the early stage based on risk factors. The extra-trees classifier was used to remove irrelevant features, while SVM was utilized to di…
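As a rough illustration of the pipeline the abstract describes (extra-trees for feature selection feeding an SVM classifier), the sketch below uses scikit-learn's ExtraTreesClassifier inside SelectFromModel ahead of an SVC. The dataset, importance threshold, and hyperparameters are placeholders, not the authors' reported configuration.

```python
# Minimal sketch, assuming a scikit-learn workflow; the breast-cancer
# dataset here is a stand-in for the paper's risk-factor data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

pipeline = Pipeline([
    # Extra-trees ranks features by impurity-based importance;
    # SelectFromModel keeps only those above the median importance.
    ("select", SelectFromModel(
        ExtraTreesClassifier(n_estimators=200, random_state=42),
        threshold="median",
    )),
    ("scale", StandardScaler()),
    ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
])

pipeline.fit(X_train, y_train)
print(f"Held-out accuracy: {pipeline.score(X_test, y_test):.3f}")
```

In this arrangement the feature-selection step removes low-importance attributes before the SVM is trained, mirroring the division of labor described in the abstract.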

Cited by 61 publications (25 citation statements)
References 33 publications
“…The extra trees algorithm is an ensemble learning method in which each decision tree is constructed from the raw training dataset. Each tree randomly selects k features, each feature randomly selects a split node, and then a score for each split node is calculated based on some mathematical metrics (e.g., the Gini index); the node with the highest score is selected as the final split node [ 19 ]. This random feature selection makes the randomness of each sub model greater, which suppresses the overfitting of the whole model.…”
Section: Methods
Mentioning confidence: 99%
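The quoted description of the split mechanism can be illustrated with a toy sketch: draw k candidate features, draw one random threshold per feature, score each candidate with the Gini impurity, and keep the best. This is a simplified illustration under those assumptions, not the implementation used in the cited work; the gini and random_split helpers and the synthetic data are made up for the example.

```python
import numpy as np

def gini(labels: np.ndarray) -> float:
    """Gini impurity of a label array."""
    if labels.size == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / labels.size
    return 1.0 - np.sum(p ** 2)

def random_split(X: np.ndarray, y: np.ndarray, k: int, rng: np.random.Generator):
    """Pick the best of k fully random (feature, threshold) candidates."""
    best = None
    for feature in rng.choice(X.shape[1], size=k, replace=False):
        lo, hi = X[:, feature].min(), X[:, feature].max()
        threshold = rng.uniform(lo, hi)   # random cut point, not optimized
        left = y[X[:, feature] <= threshold]
        right = y[X[:, feature] > threshold]
        # Weighted impurity of the candidate split; lower is better.
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if best is None or score < best[2]:
            best = (feature, threshold, score)
    return best  # (feature index, threshold, weighted Gini)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
y = (X[:, 2] > 0).astype(int)             # feature 2 carries the signal
print(random_split(X, y, k=4, rng=rng))
```

Because thresholds are drawn at random rather than optimized per feature, each tree in the ensemble differs more strongly from the others, which is the randomness the excerpt credits with reducing overfitting.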
“…It takes input and creates multiple trees out of them to separate and select the best attributes according to the execution of the voting algorithm. This mechanism creates decision trees by utilizing the information provided with the input data set and determining an optimum position to divide multiple nodes from each other [35].…”
Section: Extra Tree Classifier (ETC)
Mentioning confidence: 99%
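The excerpt above describes extra-trees building many randomized trees and aggregating them to pick out the best attributes. A small sketch of that ranking step, assuming scikit-learn's ExtraTreesClassifier and its impurity-based feature_importances_ as a stand-in for the aggregation across trees; dataset and tree count are placeholders.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import ExtraTreesClassifier

data = load_breast_cancer()
forest = ExtraTreesClassifier(n_estimators=200, random_state=0)
forest.fit(data.data, data.target)

# Importance averaged over all randomized trees acts as the collective
# "vote" that decides which attributes are kept for the downstream model.
ranked = sorted(
    zip(data.feature_names, forest.feature_importances_),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, score in ranked[:5]:
    print(f"{name:25s} {score:.3f}")
```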
“…Artificial intelligence (AI) and machine learning (ML)-based diagnostics have been widely utilized by researchers and practitioners to detect risks and facilitate decision-making in a range of contexts, from the prediction of chronic diseases like cancer [70,71] to psychological disorders [72]. Therefore, the ML-based deep learning model yielded in this experiment can be further implemented in web/mobile-based applications for practical usage and gives the maximum benefit of early glaucoma detection to the worldwide community.…”
Section: Practical Application
Mentioning confidence: 99%