“…For Random Forest, the hyperparameter search space was "n_estimators: range(10, 300, 10)", "min_samples_split: range(5, 50, 5)", "min_samples_leaf: range(2, 40, 2)", "max_depth: range(1, 30, 2)", "criterion: 'gini', 'entropy'", "class_weight: None, 'balanced'". For XGBoost, it was "max_depth: [3, 5, 7]", "learning_rate: [0.1, 0.01, 0.001]", "subsample: [0.1, 0.01, 0.001]", "colsample_bytree: [0.5, 0.7, 1]", "gamma: [0, 0.1, 0.2, 0.3, 0.4]", "reg_alpha: [0, 0.001, 0.005, 0.01, 0.05]", "reg_lambda: [0, 0.001, 0.005, 0.01, 0.05]". The hyperparameters of LightGBM were the same as those of XGBoost, except that "gamma" was not included.…”
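A minimal sketch of these search spaces as Python dictionaries, in the form typically passed to a grid-search utility such as scikit-learn's `GridSearchCV` (the source does not name the search tool, so that pairing is an assumption; the grid values themselves are transcribed from the text):

```python
# Random Forest search space, as stated in the text
rf_param_grid = {
    "n_estimators": list(range(10, 300, 10)),
    "min_samples_split": list(range(5, 50, 5)),
    "min_samples_leaf": list(range(2, 40, 2)),
    "max_depth": list(range(1, 30, 2)),
    "criterion": ["gini", "entropy"],
    "class_weight": [None, "balanced"],
}

# XGBoost search space, as stated in the text
xgb_param_grid = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.1, 0.01, 0.001],
    "subsample": [0.1, 0.01, 0.001],
    "colsample_bytree": [0.5, 0.7, 1],
    "gamma": [0, 0.1, 0.2, 0.3, 0.4],
    "reg_alpha": [0, 0.001, 0.005, 0.01, 0.05],
    "reg_lambda": [0, 0.001, 0.005, 0.01, 0.05],
}

# LightGBM reuses the XGBoost grid minus "gamma", per the text
lgbm_param_grid = {k: v for k, v in xgb_param_grid.items() if k != "gamma"}
```

Each dictionary maps a hyperparameter name to the candidate values searched, e.g. `GridSearchCV(estimator, rf_param_grid, cv=5)` with a fitted estimator of the matching library.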