Computational Intelligence for Multimedia Big Data on the Cloud With Engineering Applications 2018
DOI: 10.1016/b978-0-12-813314-9.00002-5

Computational Intelligence in Smart Grid Environment

Cited by 10 publications (10 citation statements)
References 59 publications
“…Data was used with 5-fold cross-validation. The input features were based on participants' responses to all survey questions except questions 10, 11, 12, 16, 17, 18, 20, and 21 (Supplementary Material); these omitted questions concerned self-beliefs and emotions and were excluded to prevent bias in the prediction process. The output was the severity of the post-vaccination side effects (i.e., no, mild, moderate, and severe side effects).…”
Section: Machine Learning
confidence: 99%
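The evaluation protocol this quote describes can be sketched with a generic classifier. The data below is synthetic, and the feature and class counts are illustrative assumptions, not the study's actual survey data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the survey responses: features play the role of the
# retained question answers; the target is side-effect severity
# (0 = none, 1 = mild, 2 = moderate, 3 = severe).
X, y = make_classification(n_samples=300, n_features=13, n_informative=8,
                           n_classes=4, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 5-fold cross-validation: the data is split into 5 folds, each used once
# as the held-out test set while the other 4 train the model.
scores = cross_val_score(clf, X, y, cv=5)
print(len(scores))  # one accuracy score per fold
```

Each element of `scores` is the accuracy on one held-out fold; the mean of the five scores is the usual summary statistic.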
“…RF is based on an ensemble of decision trees (DTs). Each tree predicts a classification independently and "votes" for the related class; the majority of the votes decides the overall RF prediction [18]. The RF learner node within the KNIME Analytics Platform was implemented with the following settings: the splitting criterion is the information gain ratio, which normalizes the standard information gain by the split entropy to overcome any unfair preference for nominal splits with many child nodes, and the number of trees is 100.…”
Section: Random Forest (RF)
confidence: 99%
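As an illustrative sketch (not the cited KNIME node), the tree-voting mechanism the quote describes can be reproduced by querying each tree of a scikit-learn forest individually. Note two assumed substitutions: scikit-learn's `RandomForestClassifier` averages class probabilities rather than counting hard votes, and its default split criterion is Gini impurity, not the information gain ratio used in KNIME:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# 100 trees, matching the cited KNIME setup.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Reproduce the "voting" by hand: each tree predicts independently, and the
# forest's overall prediction is taken as the majority class across trees.
votes = np.array([tree.predict(X[:1]) for tree in rf.estimators_])
majority = int(np.bincount(votes.astype(int).ravel()).argmax())
print(majority)
```

On an easily separable sample like this one, the hand-counted majority vote agrees with `rf.predict`; on borderline samples the probability-averaging forest can occasionally differ from a strict hard-vote count.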
“…RF is a versatile ML approach [34–36] based on an ensemble of decision trees (DTs), with each tree independently predicting a classification and "voting" for the related class, and the majority of the votes deciding the overall RF prediction [42]. Within the KNIME Analytics Platform, we constructed an RF learner node with the following settings: the splitting criterion is the information gain ratio, and the number of trees is 100.…”
Section: Methods
confidence: 99%
“…XGBoost employs an ensemble of weak DT-type models to generate boosted DT-type models. This system incorporates a unique tree learning algorithm as well as a theoretically justified weighted quantile sketch technique with parallel and distributed computation [41, 42, 93]. We constructed the XGBoost learner node within the KNIME Platform as follows: the tree booster was used with the depthwise grow policy, boosting rounds = 100, eta = 0.3, gamma = 0, maximum depth = 6, minimum child weight = 1, maximum delta step = 0, sub-sampling rate = 1, column sampling rate by tree = 1, column sampling rate by level = 1, lambda = 1, alpha = 0, sketch epsilon = 0.03, scale positive weight = 1, maximum number of bins = 256, sample type = uniform, normalize type = tree, and dropout rate = 0.…”
Section: Methods
confidence: 99%
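The dialog labels listed in this quote map onto the xgboost Python package's parameter names roughly as shown below. This dictionary is an assumed translation of the KNIME settings, not code from the cited paper, and the mapping of each label is a best-effort guess:

```python
# Assumed translation of the KNIME XGBoost learner dialog into xgboost
# parameter names (a sketch, not verified against KNIME's source code).
params = {
    "booster": "gbtree",          # "tree booster"
    "grow_policy": "depthwise",
    "eta": 0.3,                   # learning rate
    "gamma": 0,                   # minimum loss reduction to split
    "max_depth": 6,
    "min_child_weight": 1,
    "max_delta_step": 0,
    "subsample": 1,               # sub-sampling rate
    "colsample_bytree": 1,
    "colsample_bylevel": 1,
    "lambda": 1,                  # L2 regularization
    "alpha": 0,                   # L1 regularization
    "sketch_eps": 0.03,           # sketch epsilon (approx tree method)
    "scale_pos_weight": 1,
    "max_bin": 256,
    # sample_type, normalize_type, and the dropout rate only apply to the
    # "dart" booster; with "gbtree" they are ignored.
}

# Boosting rounds are passed separately to xgb.train(), not inside params.
num_boost_round = 100
print(sorted(params))
```

With the real library, one would call `xgboost.train(params, dtrain, num_boost_round=num_boost_round)` on a `DMatrix` of training data.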