2020
DOI: 10.1016/j.apor.2020.102339

Using Random forest and Gradient boosting trees to improve wave forecast at a specific location

Cited by 76 publications (19 citation statements)
References 23 publications
“…Gradient boosting decision tree (GBDT) and its effective implementations such as XGBoost [3] and LightGBM [4] are widely used machine learning methods in both industrial and academic applications [5,11,12]. In distributed GBDT, the training data is located on different machines and should be partitioned at the sample level.…”
Section: Federated Gradient Boosting Decision Tree (mentioning)
confidence: 99%
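As a concrete illustration of the excerpt above, here is a minimal sketch of sample-level (row-wise) partitioning of training data together with a GBDT fit using LightGBM's Python API; the toy data, shard count, and hyperparameters are illustrative assumptions rather than details from the cited work.

```python
# Minimal sketch, assuming LightGBM's Python API as the GBDT implementation.
# It illustrates sample-level (row-wise) partitioning of training data, as
# described for distributed GBDT, and fits a local model on one shard.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                     # toy feature matrix
y = 2.0 * X[:, 0] + rng.normal(size=1000)          # toy regression target

# Sample-level partition: split rows (not features) into shards, one per worker.
n_workers = 4
shards = np.array_split(np.arange(len(X)), n_workers)

# In a real distributed setting each shard lives on a different machine;
# here we simply fit a GBDT on the first shard to show the API.
local_idx = shards[0]
model = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(X[local_idx], y[local_idx])
print(model.predict(X[:5]))
```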
“…In this context, it is necessary to consider factors such as privacy protection, unbalanced/skewed data distribution, and fairness to form a closed-loop federated learning system (FLS) [2]. On the other hand, gradient boosting decision trees (GBDTs) have become a popular machine learning algorithm and have shone in many machine learning and data mining competitions [3,4] as well as real-world applications, owing to their strong results on classification, ranking, prediction, etc. (especially for tabular data mining tasks) [5]. Several works have studied the horizontal federated GBDT system [6,7].…”
Section: Introduction (mentioning)
confidence: 99%
“…The RF is trained with three sets of inputs: the electricity generated, the wave energy flux at the nearest grid point, and ocean and atmospheric data. In [163], RFs and gradient boosting trees are used to improve significant wave height forecasting. Hyperparameter values of these ML algorithms are tuned using Bayesian optimization.…”
Section: RF in Marine and Ocean Energy (mentioning)
confidence: 99%
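The excerpt above is close to the focal paper's setup (random forest and gradient boosting trees with Bayesian hyperparameter tuning for significant wave height), so a hedged sketch follows. It uses scikit-learn estimators and scikit-optimize's BayesSearchCV; the feature matrix, target, and search ranges are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: tune a random forest and a gradient boosting regressor for a
# significant-wave-height-style target using Bayesian optimization
# (BayesSearchCV from scikit-optimize). Data and search ranges are made up.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from skopt import BayesSearchCV
from skopt.space import Integer, Real

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))                                      # e.g. model Hs, period, wind speed...
y_hs = X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.1, size=500)   # stand-in for observed Hs

searches = {
    "random_forest": BayesSearchCV(
        RandomForestRegressor(random_state=0),
        {"n_estimators": Integer(100, 600), "max_depth": Integer(3, 20)},
        n_iter=20, cv=5, scoring="neg_root_mean_squared_error",
    ),
    "gradient_boosting": BayesSearchCV(
        GradientBoostingRegressor(random_state=0),
        {"n_estimators": Integer(100, 600),
         "learning_rate": Real(0.01, 0.3, prior="log-uniform"),
         "max_depth": Integer(2, 8)},
        n_iter=20, cv=5, scoring="neg_root_mean_squared_error",
    ),
}

for name, search in searches.items():
    search.fit(X, y_hs)
    print(name, search.best_params_, "RMSE:", -search.best_score_)
```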
“…In this paper, we aim to present a novel approach to video segmentation, using the optimization of segmentation parameters based on a grid-based ensemble method [62]. Our approach is a combination of random forest and gradient boosting decision tree [63], [64], [65], [66], [67], [68], [69], [70]. The system framework of our approach will be explained in Section 2.…”
Section: Introduction (mentioning)
confidence: 99%
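Since the excerpt above combines a random forest with a gradient boosting decision tree, a minimal sketch of one such combination follows, using scikit-learn's VotingRegressor to average the two models' predictions; this is an assumed stand-in, not the cited work's grid-based ensemble for video segmentation.

```python
# Minimal sketch of combining a random forest with a gradient boosting decision
# tree by averaging their predictions (scikit-learn VotingRegressor). This is an
# illustrative assumption, not the cited grid-based ensemble method.
import numpy as np
from sklearn.ensemble import (
    GradientBoostingRegressor, RandomForestRegressor, VotingRegressor,
)

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=300)

ensemble = VotingRegressor([
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("gbdt", GradientBoostingRegressor(n_estimators=200, random_state=0)),
])
ensemble.fit(X, y)
print(ensemble.predict(X[:3]))      # averaged RF + GBDT predictions
```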