2018
DOI: 10.1155/2018/4058403
SGB-ELM: An Advanced Stochastic Gradient Boosting-Based Ensemble Scheme for Extreme Learning Machine

Abstract: A novel ensemble scheme for extreme learning machine (ELM), named Stochastic Gradient Boosting-based Extreme Learning Machine (SGB-ELM), is proposed in this paper. Instead of incorporating the stochastic gradient boosting method into the ELM ensemble procedure directly, SGB-ELM constructs a sequence of weak ELMs in which each individual ELM is trained additively by optimizing a regularized objective. Specifically, we design an objective function based on the boosting mechanism in which a regularization item is intr…
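The abstract describes building an additive sequence of weak ELMs under a regularized boosting objective. Below is a minimal sketch of that general idea for squared loss: a plain residual-fitting stochastic gradient boosting loop over basic ELM regressors. It is not the paper's exact SGB-ELM derivation (which optimizes a specific regularized boosting objective for the output weights); all class names, hyperparameters, and the toy data are illustrative assumptions.

```python
# Sketch: stochastic gradient boosting over ELM base learners (squared loss).
# Illustrative only; not the authors' exact SGB-ELM objective.
import numpy as np

class ELMRegressor:
    """Basic extreme learning machine: random hidden layer + ridge least-squares output weights."""
    def __init__(self, n_hidden=50, reg=1e-3, rng=None):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = rng or np.random.default_rng(0)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
        self.b = self.rng.normal(size=self.n_hidden)                # random biases
        H = self._hidden(X)
        # Closed-form, ridge-regularized solution for the output weights beta
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

class SGBELMSketch:
    """Stochastic gradient boosting with ELMs as weak learners (residual fitting, squared loss)."""
    def __init__(self, n_rounds=20, learning_rate=0.1, subsample=0.5, n_hidden=50, seed=0):
        self.n_rounds, self.lr, self.subsample = n_rounds, learning_rate, subsample
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)
        self.learners = []

    def fit(self, X, y):
        n = len(y)
        self.init_ = y.mean()                      # constant initial model
        F = np.full(n, self.init_)
        for _ in range(self.n_rounds):
            residual = y - F                       # negative gradient of squared loss
            idx = self.rng.choice(n, size=max(1, int(self.subsample * n)), replace=False)
            elm = ELMRegressor(self.n_hidden, rng=self.rng).fit(X[idx], residual[idx])
            self.learners.append(elm)
            F += self.lr * elm.predict(X)          # additive update
        return self

    def predict(self, X):
        F = np.full(X.shape[0], self.init_)
        for elm in self.learners:
            F += self.lr * elm.predict(X)
        return F

# Toy usage on synthetic data
X = np.random.default_rng(1).uniform(-3, 3, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
model = SGBELMSketch().fit(X, y)
print("train MSE:", np.mean((model.predict(X) - y) ** 2))
```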

Cited by 8 publications (5 citation statements)
References 25 publications
“…Previous ML studies of biomarkers in PE have – in very small samples – obtained AUCs of 0.90–0.96, including many more markers [51][52][53]. In this study we have not, except for maternal weight, included maternal risk factors [15], so it must be expected that a clinical performance will be better.…”
Section: Discussion
confidence: 92%
“…The reason for this difference is explained by the ability of the QLattice model to capture the Gaussian interaction between Re and Lp. Previous ML studies of biomarkers in PE have – in very small samples – obtained AUCs of 0.90–0.96, including many more markers [51-53]. In this study we have not, except for maternal weight, included maternal risk factors [15], so it must be expected that a clinical performance will be better.…”
Section: Discussion
confidence: 95%
“…Extreme Gradient Boosting (XGBoost) and random forest (RF) algorithms were used to predict the probability of septic arthritis. The XGBoost algorithm is an extensible gradient boosting machine that can be applied to both regression and classification [6]. Since its introduction in 2014, the impact of XGBoost has been widely recognised in several machine learning and data-mining challenges.…”
Section: Methods
confidence: 99%
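The statement above describes fitting XGBoost and random forest classifiers to output class probabilities for a binary outcome. A hedged sketch of that pattern follows; the synthetic features, target, and hyperparameters are illustrative assumptions, not details from the cited study, and it assumes the xgboost and scikit-learn packages are available.

```python
# Sketch: XGBoost and random forest producing class probabilities for a binary outcome.
import numpy as np
from xgboost import XGBClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                                    # placeholder clinical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

xgb = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1,
                    eval_metric="logloss").fit(X_tr, y_tr)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Predicted probability of the positive class from each model
print("XGBoost:", xgb.predict_proba(X_te)[:5, 1])
print("RF:     ", rf.predict_proba(X_te)[:5, 1])
```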
“…For prediction of sarcopenia, the binary classification (sarcopenia/normal) model of XGBoost was used. The XGBoost model is one of the most commonly used machine learning models for solving both regression and classification problems and has been widely adopted to classify and predict medical events [3,45,46]. The synthetic minority oversampling technique (SMOTE) was utilised to overcome the potential bias arising from class imbalances by creating synthetic minority class samples (the sarcopenia group in this case) [47,48].…”
Section: Machine Learning Model for Sarcopenia Prediction
confidence: 99%
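The statement above combines SMOTE oversampling with an XGBoost binary classifier. A hedged sketch of that pipeline follows; the data, imbalance ratio, and hyperparameters are illustrative assumptions, and it assumes the imbalanced-learn and xgboost packages are available.

```python
# Sketch: SMOTE oversampling of the minority class followed by XGBoost classification.
import numpy as np
from imblearn.over_sampling import SMOTE
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))                          # placeholder features
y = (rng.uniform(size=600) < 0.15).astype(int)          # ~15% minority ("sarcopenia") class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, test_size=0.25, random_state=0)

# Synthesize minority-class samples on the training split only, to avoid leakage
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1,
                    eval_metric="logloss").fit(X_res, y_res)
print(clf.predict_proba(X_te)[:5, 1])                   # predicted probability of the minority class
```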