2020 Fifth International Conference on Research in Computational Intelligence and Communication Networks (ICRCICN)
DOI: 10.1109/icrcicn50933.2020.9296176
Comparison of Gradient Boosting and Extreme Boosting Ensemble Methods for Webpage Classification

Cited by 9 publications (5 citation statements) · References 8 publications
“…The initial data set may have abnormalities or inaccurate values that impair the quality of the dataset [12]. Data pre-processing involves various steps such as data cleaning, encoding categorical values, feature selection, data normalization or standardization, and splitting into training and testing sets.…”
Section: Data Pre-processing and Experimental Setup
confidence: 99%
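The pre-processing steps listed in this citing paper can be sketched as a short scikit-learn pipeline. The data, column names, and parameter choices below are hypothetical stand-ins, not the paper's actual setup:

```python
# Illustrative pre-processing sketch (hypothetical data and column names).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, StandardScaler

# Toy dataset standing in for webpage features.
df = pd.DataFrame({
    "word_count": [120, 450, 80, 300, 95, 610],
    "category":   ["news", "blog", "news", "shop", "blog", "shop"],
    "label":      [0, 1, 0, 1, 0, 1],
})

# Data cleaning: drop rows with missing values.
df = df.dropna()

# Encoding categorical values.
df["category"] = LabelEncoder().fit_transform(df["category"])

# Feature selection (here simply choosing the feature columns).
X = df[["word_count", "category"]]
y = df["label"]

# Normalization / standardization.
X = StandardScaler().fit_transform(X)

# Training and testing split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42)
```

Each step mirrors one item in the quoted list; in practice the encoder and scaler would be fit on the training split only to avoid leakage.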
“…3. The proposed algorithm is compared with logistic regression (LR) [54], support vector classifier (SVC) [54], random forest (RF) classifier [55], AdaBoost classifier [55], K-nearest neighbor (KNN) [56], extreme gradient boosting (XGB) [57], extra trees classifier [58], gradient boosting classifier [59], and Gaussian NB [60]. These classifiers are evaluated on both two-class and three-class settings to measure the performance of each.…”
Section: Proposed Algorithms For Multi-weight Polarity Sentiments
confidence: 99%
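A benchmark like the one this citing paper describes can be sketched with the scikit-learn implementations of the listed baselines. The data here is synthetic, and XGBoost itself is omitted because it requires the separate `xgboost` package; this is an illustrative comparison loop, not the paper's experiment:

```python
# Sketch of a baseline-classifier comparison on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier, RandomForestClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVC": SVC(),
    "RF": RandomForestClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "ExtraTrees": ExtraTreesClassifier(random_state=0),
    "GB": GradientBoostingClassifier(random_state=0),
    "GaussianNB": GaussianNB(),
}

# Fit each model and score held-out accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {acc:.3f}")
```

The two-class vs. three-class comparison in the quote would simply rerun this loop with `n_classes=3` (and `n_informative` raised accordingly) in `make_classification`.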
“…It involves training the models on the preprocessed healthcare data to learn patterns, relationships, and predictive features associated with different diseases. Various machine learning algorithms, such as XGBoost (eXtreme Gradient Boosting), can be employed for this purpose [5]. The training process involves splitting the dataset into training and validation subsets to assess model performance and prevent overfitting.…”
Section: Model Training
confidence: 99%
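The train/validation protocol described in this quote is a gap check: if accuracy on the training split greatly exceeds accuracy on the held-out split, the model is overfitting. A minimal sketch, using scikit-learn's `GradientBoostingClassifier` as a stand-in for XGBoost and synthetic data:

```python
# Train/validation split to monitor overfitting (hypothetical setup;
# GradientBoostingClassifier stands in for XGBoost here).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=1)

model = GradientBoostingClassifier(random_state=1).fit(X_train, y_train)

train_acc = model.score(X_train, y_train)
val_acc = model.score(X_val, y_val)

# A large gap between the two scores is a symptom of overfitting.
print(f"train={train_acc:.3f}  val={val_acc:.3f}  gap={train_acc - val_acc:.3f}")
```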
“…XGBoost is a powerful ensemble learning algorithm that is used for both classification and regression tasks [6]. It uses a gradient boosting framework to iteratively train multiple weak models (decision trees) on the data, where each subsequent model tries to correct the errors of the previous models [5].…”
Section: XGBoost
confidence: 99%
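The "each subsequent model corrects the errors of the previous models" idea in this quote can be illustrated with a hand-rolled boosting loop: each shallow regression tree is fit to the residuals of the ensemble so far. This is a toy squared-error sketch, not XGBoost's regularized objective or its second-order optimization:

```python
# Minimal gradient-boosting loop: each tree fits the current residuals,
# so every round corrects the errors of the ensemble built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
pred = np.full_like(y, y.mean())   # start from the mean prediction
trees = []

for _ in range(50):
    residuals = y - pred                      # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += learning_rate * tree.predict(X)   # nudge predictions toward y
    trees.append(tree)

mse = float(np.mean((y - pred) ** 2))
print(f"training MSE after boosting: {mse:.4f}")
```

XGBoost follows the same skeleton but adds a regularized loss, second-order gradient information, and engineering optimizations such as column subsampling and sparsity-aware splits.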