2019 9th International Conference on Advances in Computing and Communication (ICACC)
DOI: 10.1109/icacc48162.2019.8986217
Estimation of Rainfall Quantity using Hybrid Ensemble Regression

Cited by 2 publications (4 citation statements) · References 20 publications
“…Their study found that, owing to the non-linear relationships in rainfall datasets and the ANN model's ability to learn from past data, it provides a better solution than existing techniques. Ganesh et al. [3] created a composite ensemble regression model by combining bagging regression (BAR), extra tree regression (ETR), random forest regression (RFR), gradient boosting regression (GBR), and extreme gradient boosting regression (XGBoostR). Consequently, these ensemble regression models, in which two or more models are implemented in various combinations, are used to predict precipitation rather than individual features.…”
Section: Literature Review (mentioning)
confidence: 99%
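The cited approach combines several tree-based regressors into a single hybrid ensemble. The sketch below is an illustration only, showing how such a combination could be assembled with scikit-learn's VotingRegressor together with xgboost; the synthetic data, simple prediction averaging, and default hyperparameters are assumptions for the example, not the configuration reported in [3].

```python
# Illustrative sketch: combining BAR, ETR, RFR, GBR and XGBoostR into one
# ensemble regressor. Assumes scikit-learn and xgboost are installed; the
# synthetic data and averaging strategy are assumptions, not the cited setup.
import numpy as np
from sklearn.ensemble import (
    BaggingRegressor,
    ExtraTreesRegressor,
    RandomForestRegressor,
    GradientBoostingRegressor,
    VotingRegressor,
)
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Synthetic stand-in for a rainfall dataset (features -> rainfall amount).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The five base regressors are combined by averaging their predictions.
ensemble = VotingRegressor(
    estimators=[
        ("bar", BaggingRegressor(random_state=0)),
        ("etr", ExtraTreesRegressor(random_state=0)),
        ("rfr", RandomForestRegressor(random_state=0)),
        ("gbr", GradientBoostingRegressor(random_state=0)),
        ("xgb", XGBRegressor(random_state=0)),
    ]
)
ensemble.fit(X_train, y_train)
print("R^2 on held-out data:", ensemble.score(X_test, y_test))
```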
“…Figure 13 shows model stacking with ensemble layers. GBR builds an ensemble of weak prediction models, similar to other boosting techniques [3]. Chen et al. [24] proposed XGBoost, which is based on the gradient boosting paradigm.…”
Section: Stacking of Model (mentioning)
confidence: 99%
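This citation statement refers to model stacking, where base ensemble learners feed a second-layer meta-model. The sketch below is a minimal illustration using scikit-learn's StackingRegressor, assuming GBR and XGBoost as the first layer and a ridge meta-learner; these choices and the synthetic data are assumptions made for illustration, not the configuration used in the works cited above.

```python
# Illustrative sketch of stacked ensemble layers. Assumes scikit-learn and
# xgboost; the base learners and ridge meta-learner are example choices.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from xgboost import XGBRegressor

# Synthetic stand-in for a rainfall dataset.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 4))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.3, size=400)

# First layer: GBR and XGBoost. Second layer: a linear meta-model that learns
# how to weight the base predictions, fitted on out-of-fold predictions (cv=5).
stack = StackingRegressor(
    estimators=[
        ("gbr", GradientBoostingRegressor(random_state=1)),
        ("xgb", XGBRegressor(random_state=1)),
    ],
    final_estimator=Ridge(),
    cv=5,
)
stack.fit(X, y)
print("In-sample R^2:", stack.score(X, y))
```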