Proceedings 2001 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2001.989527
Evaluating boosting algorithms to classify rare classes: comparison and improvements

Cited by 189 publications (123 citation statements)
References 6 publications
“…proposed that the AdaCost algorithm is one of the representative methods of cost-sensitive learning. Joshi et al. [7] showed that the AdaCost algorithm achieves high accuracy and recall on minority-class samples.…”
Section: Related Work and Problems
Mentioning confidence: 99%
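The quote above describes AdaCost as a cost-sensitive boosting method suited to rare classes. The sketch below illustrates the underlying idea: scale the boosting re-weighting step by a per-example misclassification cost so that errors on the rare class raise example weights more. This is a simplified illustration, not the published AdaCost update, which uses a cost-adjustment function inside the exponent; all names, labels, and cost values here are assumptions.

```python
# Illustrative cost-sensitive re-weighting in the spirit of AdaCost.
# NOT the exact published algorithm: AdaCost applies a cost-adjustment
# function inside the exponent, while this sketch uses a bare cost multiplier.
import math

def cost_sensitive_update(w, y, pred, alpha, cost):
    """One boosting round's re-weighting, scaled by per-example costs."""
    new_w = []
    for wi, yi, pi, ci in zip(w, y, pred, cost):
        if yi == pi:
            new_w.append(wi * math.exp(-alpha))      # correct: shrink weight
        else:
            new_w.append(wi * math.exp(alpha * ci))  # wrong: grow by its cost
    total = sum(new_w)
    return [wi / total for wi in new_w]              # renormalize to sum to 1

# Illustrative toy round: the first example is the rare, high-cost class.
w = [0.25, 0.25, 0.25, 0.25]
y = [1, 1, -1, -1]
pred = [-1, 1, -1, -1]        # the rare example was misclassified
cost = [3.0, 1.0, 1.0, 1.0]   # higher cost on the rare class
new_w = cost_sensitive_update(w, y, pred, alpha=0.5, cost=cost)
print(new_w)
```

After the update, the misclassified rare example dominates the weight distribution, so the next weak learner concentrates on it.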
“…Several boosting algorithms have been developed since the preliminary work by Schapire [31], which include cost-sensitive versions [32], [33] and those which can deliver confidence approximations in their forecasts [34]. The most widely used variant of boosting methods is AdaBoost [35].…”
Section: Constructing
Mentioning confidence: 99%
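Since the quote above singles out AdaBoost as the most widely used boosting variant, a minimal sketch may help: each round fits a weak learner (here a one-feature threshold stump) to weighted data, then up-weights misclassified examples so later rounds focus on them. The stump search, function names, and toy data are illustrative assumptions, not the paper's implementation.

```python
# Minimal AdaBoost sketch with decision stumps (illustrative, not optimized).
import numpy as np

def train_adaboost(X, y, n_rounds=5):
    """Boost one-feature threshold stumps; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)      # start with uniform example weights
    ensemble = []                # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        for f in range(X.shape[1]):              # exhaustive stump search
            for thr in np.unique(X[:, f]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()     # weighted training error
                    if err < best_err:
                        best_err, best = err, (f, thr, pol, pred)
        f, thr, pol, pred = best
        err = max(best_err, 1e-10)               # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # weak learner's vote weight
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, f, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, f, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
    return np.sign(score)

# Toy one-dimensional dataset, separable at x = 3
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
model = train_adaboost(X, y, n_rounds=5)
print((predict(model, X) == y).mean())  # training accuracy
```

The re-weighting line `w *= np.exp(-alpha * y * pred)` is the core of the method: agreement between `y` and `pred` shrinks a weight, disagreement grows it.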
“…Therefore, several common performance metrics were applied, such as Recall (1), Precision (2), FP-rate (3), and F-score (4) [6], [7]. Also, a Receiver Operating Characteristic (ROC) analysis [8] was applied.…”
Section: Classifier Performance
Mentioning confidence: 99%
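The four metrics named in the quote above all derive from the binary confusion counts (TP, FP, FN, TN). A small sketch, with an illustrative helper name and toy labels of my own (not from the cited paper):

```python
# Recall, precision, FP-rate, and F-score from binary confusion counts.
def rare_class_metrics(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    recall = tp / (tp + fn) if tp + fn else 0.0       # found rare positives
    precision = tp / (tp + fp) if tp + fp else 0.0    # correct positive calls
    fp_rate = fp / (fp + tn) if fp + tn else 0.0      # false alarms on negatives
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)        # harmonic mean
    return {"recall": recall, "precision": precision,
            "fp_rate": fp_rate, "f_score": f_score}

# Toy example: 2 rare positives among 8 examples
y_true = [0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 0, 0, 1, 0]
m = rare_class_metrics(y_true, y_pred)
print(m)
```

For rare classes these metrics are more informative than plain accuracy, since predicting the majority class everywhere already scores high accuracy while recall collapses to zero.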