2022
DOI: 10.1007/s10489-022-03585-2
Majority-to-minority resampling for boosting-based classification under imbalanced data

Cited by 10 publications (5 citation statements)
References 67 publications
“…This is because classifiers can become biased towards the majority group, potentially overlooking or inaccurately classifying instances from the less-represented category. Ensuring accurate classification in these circumstances is crucial to avoiding misleading results [20], [21]. The characteristics of the dataset can exacerbate the issue of having insufficient training observations, which in turn can lead to overfitting.…”
Section: Crowd Scene Classification Using Fully Connected Deep Neural...
confidence: 99%
“…Wang et al [26] introduced a hybrid strategy called Majority-to-Minority Resampling (MMR) and a boosting algorithm called Majority-to-Minority Boosting (MMBoost) for classification tasks. MMR was developed to tackle class imbalance by taking samples from the majority class to augment the minority class.…”
Section: Related Work
confidence: 99%
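The core MMR idea quoted above — moving instances from the majority class over to the minority class — can be illustrated with a minimal sketch. This is not the authors' MMR/MMBoost algorithm (which selects instances adaptively inside a boosting loop); here, as a simplifying assumption, we relabel the majority instances nearest the minority-class centroid, and the function name `mmr_sketch` and its parameters are hypothetical.

```python
import numpy as np

def mmr_sketch(X, y, n_switch=5, minority_label=1, majority_label=0):
    """Illustrative majority-to-minority switching (a sketch, NOT the
    published MMR algorithm): relabel the n_switch majority-class
    instances that lie closest to the minority-class centroid."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y).copy()
    centroid = X[y == minority_label].mean(axis=0)
    maj_idx = np.flatnonzero(y == majority_label)
    dists = np.linalg.norm(X[maj_idx] - centroid, axis=1)
    switch = maj_idx[np.argsort(dists)[:n_switch]]
    y[switch] = minority_label  # majority instances become minority
    return y
```

A real implementation would choose which instances to switch adaptively (e.g., guided by the boosting weights) rather than by a fixed centroid distance.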
“…Where O_j is the output value at unit j and I_j is the input value at unit j. Next, calculate the error value used as the stopping condition using the MSE formula as in (9).…”
Section: Calculate the Information Gain Ratio For Each Feature I In F
confidence: 99%
“…Related research on resampling found that random oversampling and the SMOTE technique outperformed other resampling methods [8]. Majority-to-Minority Resampling (MMR), a hybrid approach for picking switched instances, adaptively selects potential instances from the majority class to enhance the minority class; the proposed approach outperforms several strong baselines across standard metrics for imbalanced data [9]. Similarity Oversampling and Undersampling Preprocessing (SOUP), which resamples difficult cases, outperforms specialized preprocessing methods on multi-class imbalanced problems and competes with well-known decomposition ensembles on natural and artificial datasets [10].…”
mentioning
confidence: 99%
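For contrast with the switching approaches above, random oversampling (the baseline mentioned in [8]) can be sketched in a few lines; the helper `random_oversample` below is a hypothetical illustration, not code from any of the cited papers: minority rows are duplicated at random until the classes are balanced.

```python
import numpy as np

def random_oversample(X, y, minority_label=1, seed=0):
    """Sketch of random oversampling: duplicate randomly chosen
    minority-class rows until both classes have equal counts."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X)
    y = np.asarray(y)
    min_idx = np.flatnonzero(y == minority_label)
    maj_count = int((y != minority_label).sum())
    extra = rng.choice(min_idx, size=maj_count - len(min_idx), replace=True)
    X_new = np.vstack([X, X[extra]])
    y_new = np.concatenate([y, y[extra]])
    return X_new, y_new
```

Unlike SMOTE, which synthesizes new points by interpolating between minority neighbors, this only copies existing rows, which is simpler but can encourage overfitting.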