2022
DOI: 10.3390/rs14153765

A Novel Double Ensemble Algorithm for the Classification of Multi-Class Imbalanced Hyperspectral Data

Abstract: The class imbalance problem has been reported to exist in remote sensing and hinders the classification performance of many machine learning algorithms. Several technologies, such as data sampling methods, feature selection-based methods, and ensemble-based methods, have been proposed to solve the class imbalance problem. However, these methods suffer from the loss of useful information or from artificial noise, or result in overfitting. A novel double ensemble algorithm is proposed to deal with the multi-clas…

Cited by 5 publications (3 citation statements)
References: 56 publications

“…It has the advantages of being simple, easy to implement, and having low computational cost, which makes it widely used in many applications. When building an RF model [24], it first employs the bagging method [25] to generate multiple different sub-sample sets from the original sample set. Then, the classification and regression tree (CART) algorithm is used to train the binary decision trees and construct the meta-classifier.…”
Section: Methods
confidence: 99%
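
The excerpt above summarizes the standard bagging-plus-CART construction of a random forest. A minimal sketch of that construction is given below, assuming a NumPy/scikit-learn environment and integer-encoded class labels; the function names and the number of trees are illustrative, not taken from the cited papers, and a full random forest would additionally sample a random feature subset at each split.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_bagged_cart(X, y, n_trees=100, random_state=0):
    """Bagging step: fit one CART tree per bootstrap sub-sample set."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)          # sample with replacement
        tree = DecisionTreeClassifier()           # CART: binary splits on Gini impurity
        trees.append(tree.fit(X[idx], y[idx]))
    return trees

def predict_bagged_cart(trees, X):
    """Meta-classifier: majority vote over the individual CART predictions."""
    votes = np.stack([t.predict(X) for t in trees]).astype(int)   # (n_trees, n_samples)
    return np.array([np.bincount(col).argmax() for col in votes.T])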
“…Both $f_t$ and $f_{t+1}$ obey the joint Gaussian distribution. When the expectation is zero, the joint distribution of $f_t$ and $f_{t+1}$ is shown as (24).…”
Section: End For
confidence: 99%
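
The cited equation (24) is not reproduced in the excerpt. As an illustration only, a zero-mean joint Gaussian over $f_t$ and $f_{t+1}$ takes the standard form below, where $k(\cdot,\cdot)$ denotes a covariance (kernel) function; the specific covariance used by the citing paper is not given here and is left as an assumption.

\[
\begin{bmatrix} f_t \\ f_{t+1} \end{bmatrix}
\sim \mathcal{N}\!\left(
\begin{bmatrix} 0 \\ 0 \end{bmatrix},
\begin{bmatrix} k(t,t) & k(t,t+1) \\ k(t+1,t) & k(t+1,t+1) \end{bmatrix}
\right)
\]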
“…To verify the performance of the FOG recognition algorithm proposed in this paper, recognition algorithms based on the AdaBoost [22], TomekLinks-AdaBoost [23], RUSBoost, and ROS-AdaBoost [24] integration frameworks are each tested for their FOG recognition effect. To exclude the interference of hyperparameters, the learning rate is set to 0.1, the number of iterations is set to 30, and the CART model is selected as the base classifier for all frameworks.…”
Section: Bayesian Optimization Experiments
confidence: 99%
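
The comparison described above can be reproduced in outline with scikit-learn and imbalanced-learn, as sketched below under the stated settings (learning rate 0.1, 30 iterations, CART base classifier). The data split, the macro-F1 metric, and the evaluate helper are illustrative assumptions rather than details from the cited paper; the base-learner argument is named estimator in recent library releases (older releases used base_estimator).

from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import f1_score
from sklearn.tree import DecisionTreeClassifier
from imblearn.ensemble import RUSBoostClassifier
from imblearn.over_sampling import RandomOverSampler
from imblearn.under_sampling import TomekLinks

LEARNING_RATE, N_ITER = 0.1, 30        # fixed, as stated in the excerpt
cart = DecisionTreeClassifier()        # CART base classifier for all frameworks

def adaboost():
    return AdaBoostClassifier(estimator=cart, n_estimators=N_ITER,
                              learning_rate=LEARNING_RATE)

frameworks = {
    "AdaBoost": (None, adaboost()),                       # boosting on the original data
    "TomekLinks-AdaBoost": (TomekLinks(), adaboost()),    # Tomek-links cleaning, then boosting
    "RUSBoost": (None, RUSBoostClassifier(estimator=cart, n_estimators=N_ITER,
                                          learning_rate=LEARNING_RATE)),
    "ROS-AdaBoost": (RandomOverSampler(), adaboost()),    # random over-sampling, then boosting
}

def evaluate(X_train, y_train, X_test, y_test):
    """Hypothetical helper: resample (if a sampler is given), fit, and score each framework."""
    for name, (sampler, clf) in frameworks.items():
        X_r, y_r = sampler.fit_resample(X_train, y_train) if sampler else (X_train, y_train)
        clf.fit(X_r, y_r)
        print(name, f1_score(y_test, clf.predict(X_test), average="macro"))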