2020
DOI: 10.3390/ma13214952
Predicting the Tool Wear of a Drilling Process Using Novel Machine Learning XGBoost-SDA

Abstract: Tool wear negatively impacts the quality of workpieces produced by the drilling process. Accurate prediction of tool wear enables the operator to maintain the machine at the required level of performance. This research presents a novel hybrid machine learning approach for predicting the tool wear in a drilling process. The proposed approach is based on optimizing the extreme gradient boosting algorithm’s hyperparameters by a spiral dynamic optimization algorithm (XGBoost-SDA). Simulations were carried out on c…
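To make the described approach concrete, here is a minimal, hypothetical sketch of the general idea: tuning two XGBoost hyperparameters with a simple spiral-dynamics-style search over a validation-RMSE objective. This is not the authors' XGBoost-SDA implementation; the synthetic data, hyperparameter bounds, and spiral settings below are illustrative assumptions.

```python
# Hypothetical sketch of the XGBoost-SDA idea: tune (learning_rate, max_depth)
# with a simple spiral-dynamics-style search. Data, bounds, and spiral settings
# are illustrative assumptions, not the cited study's setup.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-in for drilling-process features and measured tool wear.
X = rng.normal(size=(500, 8))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def objective(p):
    """Validation RMSE for a candidate point p = (learning_rate, max_depth)."""
    lr = float(np.clip(p[0], 0.01, 0.5))
    depth = int(np.clip(round(p[1]), 2, 10))
    model = xgb.XGBRegressor(n_estimators=200, learning_rate=lr, max_depth=depth)
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_val, model.predict(X_val)) ** 0.5

# Spiral dynamics update: rotate each candidate by angle theta and contract it by
# factor r around the current best point, spiralling the population toward it.
theta, r, n_points, n_iters = np.pi / 4, 0.95, 6, 10
R = r * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

points = np.column_stack([rng.uniform(0.01, 0.5, n_points),   # learning_rate range
                          rng.uniform(2, 10, n_points)])      # max_depth range
scores = np.array([objective(p) for p in points])
best, best_score = points[scores.argmin()].copy(), scores.min()

for _ in range(n_iters):
    points = (points - best) @ R.T + best        # spiral all points toward the best
    scores = np.array([objective(p) for p in points])
    if scores.min() < best_score:
        best, best_score = points[scores.argmin()].copy(), scores.min()

print(f"best learning_rate ~ {best[0]:.3f}, best max_depth ~ {int(round(best[1]))}, "
      f"validation RMSE ~ {best_score:.3f}")
```

In this sketch the spiral update plays the role that SDA plays in the paper: it explores the hyperparameter space while steadily contracting toward the best configuration found so far.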

Cited by 41 publications (32 citation statements)
References 34 publications
“…XGB, developed in 2015, is one of the most popular ML algorithms. Regardless of the data type, it is well known to provide better solutions than other ML algorithms because of its speed, efficiency, and scalability [52,53]. It has been the focus of research in various fields [54–56].…”
Section: Extreme Gradient Boosting Regression (XGB), mentioning
confidence: 99%
“…It has been the focus of research in various fields [54–56]. In particular, in mechanical machining [52,57,58], XGB is a good choice for predicting tool wear and surface roughness. XGB is used for supervised learning problems, where the training data (with multiple features) x_i are used to predict a target variable y_i.…”
Section: Extreme Gradient Boosting Regression (XGB), mentioning
confidence: 99%
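To illustrate the supervised setup described in that statement (feature vectors x_i used to predict a target y_i), here is a minimal, hypothetical fit-and-predict sketch with XGBoost's scikit-learn interface. The synthetic data is an assumed stand-in for drilling-process features and measured tool wear, not the cited study's dataset or settings.

```python
# Minimal, hypothetical sketch of the supervised XGB setup described above:
# training features x_i predict a continuous target y_i (e.g. tool wear).
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
X = rng.normal(size=(400, 6))                                        # x_i: feature vectors
y = 0.5 * X[:, 0] - X[:, 1] ** 2 + rng.normal(scale=0.1, size=400)   # y_i: target values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = xgb.XGBRegressor(n_estimators=300, learning_rate=0.1, max_depth=4)
model.fit(X_train, y_train)

rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print(f"test RMSE: {rmse:.3f}")
```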