2016
DOI: 10.1007/978-3-319-49586-6_25
Adaptive Multi-objective Swarm Crossover Optimization for Imbalanced Data Classification

Cited by 8 publications (3 citation statements); References: 29 publications
“…Li et al. [28] presented a unified pre-processing method utilizing stochastic swarm heuristics to jointly optimize the mixtures from the two classes by gradually reconstructing the training dataset. Their method exhibited competitive performance in comparison with popular techniques.…”
Section: Literature Review
confidence: 99%
“…4 illustrate this situation. Therefore, there is no unique global best solution; the suitable solutions are all recorded in a solution set, called the non-inferior set or Pareto optimal set [40,41]. This set contains the solutions satisfying one of the following conditions: 1. the solution performs better on both objectives; 2. the solution improves one objective while the other does not degrade beyond the tolerance range (1.0e-4).…”
Section: Multi-objective Problem in Imbalance Classification
confidence: 99%
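
The two quoted conditions describe a tolerance-based (epsilon-style) dominance test. Below is a minimal Python sketch of keeping such a non-inferior set for two objectives that are both maximized, using the 1.0e-4 tolerance quoted above; the function names, the tuple-based solution encoding, and the example values are illustrative assumptions, not the authors' code.

EPS = 1.0e-4  # allowed degradation on the other objective (quoted tolerance)

def displaces(a, b, eps=EPS):
    # Candidate a pushes b out of the non-inferior set when a is better on at
    # least one objective and b is not better than a by more than eps on either.
    better_somewhere = a[0] > b[0] or a[1] > b[1]
    not_worse = a[0] >= b[0] - eps and a[1] >= b[1] - eps
    return better_somewhere and not_worse

def non_inferior_set(solutions):
    # Keep every (objective_1, objective_2) pair that no other candidate displaces.
    front = []
    for cand in solutions:
        if any(displaces(other, cand) for other in solutions if other is not cand):
            continue
        front.append(cand)
    return front

# Example: four candidates scored on two objectives (higher is better).
candidates = [(0.90, 0.70), (0.85, 0.80), (0.90, 0.6999), (0.60, 0.60)]
print(non_inferior_set(candidates))  # -> [(0.9, 0.7), (0.85, 0.8)]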
“…This set contains the solutions satisfying one of the following conditions: 1. the solution performs better on both objectives; 2. the solution improves one objective while the other does not degrade beyond the tolerance range (1.0e-4). The decision making in our experiment is to select, from the non-inferior set, the solution that produces the best Kappa statistic and Accuracy as the final result [40].…”
Section: Multi-objective Problem in Imbalance Classification
confidence: 99%
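
The quoted decision-making step can be sketched as follows; the candidate structure (each non-inferior solution paired with the predictions of the classifier it yields), the tie-breaking order (Kappa first, then Accuracy), and the use of scikit-learn's metric functions are assumptions made for illustration.

from sklearn.metrics import accuracy_score, cohen_kappa_score

def select_final_solution(candidates, y_true):
    # candidates: list of (solution, y_pred) pairs, where y_pred holds the
    # predictions obtained when the classifier is trained under that solution.
    def score(item):
        _, y_pred = item
        # Rank primarily by the Kappa statistic, then by Accuracy.
        return (cohen_kappa_score(y_true, y_pred), accuracy_score(y_true, y_pred))
    best_solution, _ = max(candidates, key=score)
    return best_solution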