2022
DOI: 10.1016/j.atmosres.2022.106238
Application of XGBoost algorithm in the optimization of pollutant concentration

Cited by 61 publications (17 citation statements)
References 29 publications
“…XGBoost is an optimized distributed gradient boosting library designed to be efficient, flexible and portable ( Li et al., 2022 ). First, it constructs an appropriate number of weak learners, mainly classification regression trees, to train weak learners.…”
Section: Methods
confidence: 99%
“…Equation 10 for the combinatorial optimization task. The classification layer contains optimized instances of an XGBoost [40] and a Decision Tree [41]. The optimization layer contains instances of a parameterized Genetic Algorithm [35] and combinatorial PSO [42].…”
Section: B. Proof of Concept 2 - Compound DSS
confidence: 99%
“…The theoretical basis of XGBoost is as follows. Suppose the model has k decision trees, the integrated model can be expressed mathematically as [38]:…”
Section: Extreme Gradient Boosting (XGBoost)
confidence: 99%
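The equation referenced in the statement above is not reproduced on this page. In the standard XGBoost formulation, an ensemble of K regression trees predicts additively:

```latex
\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \qquad f_k \in \mathcal{F},
```

where each $f_k$ is a regression tree (CART) and $\mathcal{F}$ is the space of such trees. This is the widely published form of the model and may differ in notation from equation [38] of the citing paper.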
“…By adding a new tree f k to fit the residual error between the predicted value of the previous tree and the actual value, a new model is formed, and the new model is used as the basis for the next model learning. Mathematically, this can be stated as follows [38].…”
Section: Extreme Gradient Boosting (XGBoost)
confidence: 99%
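The residual-fitting step described in the statement above can be sketched with a minimal, pure-Python gradient booster: each round fits a depth-1 regression "stump" to the current residuals and adds it to the ensemble, scaled by a learning rate. The helper names (`fit_stump`, `boost`) and the toy data are illustrative assumptions, not from the cited papers; real XGBoost additionally uses full CART trees, second-order gradients, and regularization.

```python
def fit_stump(xs, residuals):
    """Find the threshold split on a 1-D feature that minimizes squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    # The stump predicts the mean residual of whichever side x falls on.
    return lambda x, t=t, lm=lm, rm=rm: lm if x <= t else rm

def boost(xs, ys, rounds=50, lr=0.3):
    """Additive model: start from the mean, then repeatedly fit residuals."""
    base = sum(ys) / len(ys)
    stumps = []

    def predict(x):
        return base + lr * sum(s(x) for s in stumps)

    for _ in range(rounds):
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        stumps.append(fit_stump(xs, residuals))
    return predict

# Toy usage: a step function is recovered from the additive ensemble.
model = boost([1, 2, 3, 4, 5, 6], [1, 1, 1, 5, 5, 5])
```

Because each round fits the residual of the previous ensemble, the error shrinks geometrically on this toy step-function data, mirroring the "add a new tree f_k to fit the residual" description in the quote.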