2012
DOI: 10.1016/j.eswa.2012.01.013

Bagging schemes on the presence of class noise in classification

Cited by 57 publications (33 citation statements)
References 16 publications
“…Other approaches are variants of the Bagging and Boosting techniques (Abellán & Masegosa, 2012; Cantador & Dorronsoro, 2005; Cao, Kwong, & Wang, 2012). Here, the same classifier is trained by using different samples from the dataset, leading to different class boundaries that are combined afterwards.…”
Section: Related Work
confidence: 99%
“…Here, the same classifier is trained by using different samples from the dataset, leading to different class boundaries that are combined afterwards. The main argument behind the Bagging schemes devised by Abellán and Masegosa (2012) in particular is that it is expected that each mislabeled instance will be part of just a few of the generated training sets, thus a majority of the boundaries will not be influenced by such instance. Even though the proposed schemes usually improve the performance, their success still strongly depends on which instances were mislabeled.…”
Section: Related Work
confidence: 99%
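The bagging argument in the statement above lends itself to a short demonstration. The following is a minimal sketch, not the authors' original experiment: it assumes a synthetic dataset, an illustrative 20% label-flip rate, and scikit-learn's BaggingClassifier with decision trees as base learners, comparing a single tree against the bagged ensemble when class noise is injected into the training labels only.

```python
# Minimal sketch (not from the paper) of the quoted argument: under
# bootstrap sampling, each mislabeled instance enters only some of the
# training sets, so most base trees keep uncontaminated boundaries.
# Dataset, noise rate, and ensemble size are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Flip 20% of the training labels to simulate class noise.
rng = np.random.default_rng(0)
flip = rng.random(len(y_tr)) < 0.20
y_noisy = np.where(flip, 1 - y_tr, y_tr)

single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_noisy)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                           random_state=0).fit(X_tr, y_noisy)

# Test labels are left clean, so the scores reflect robustness to the
# injected training noise.
print("single tree :", single.score(X_te, y_te))
print("bagged trees:", bagged.score(X_te, y_te))
```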
“…Besides, its use with bagging ensemble (Abellán & Mantas, 2014; Abellán & Masegosa, 2009a; 2012a) and its above mentioned extension (Mantas & Abellán, 2014a) are especially suitable when noisy data are classified. A complete and recent revision of machine learning methods to manipulate label noise can be found in Frenay and Verleysen (2014).…”
Section: Introduction
confidence: 99%
“…This model obtains improvements with respect to other known ensembles of classifiers used in this type of setting: the bagging scheme with the C4.5 model and the known classifier Random Forest (RF). It is shown in the literature that the bagging scheme with the C4.5 model is normally the winning model in many studies about classification noise [23,24].…”
Section: Introduction
confidence: 99%
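The comparison described in the statement above can likewise be sketched. C4.5 is not available in scikit-learn, so a CART-style DecisionTreeClassifier stands in for it here; the dataset, noise level, and ensemble sizes are illustrative assumptions rather than the paper's experimental setup.

```python
# Hedged sketch of the comparison mentioned above: bagging over a single
# decision tree versus Random Forest on data with injected class noise.
# A CART-style tree stands in for C4.5; all settings are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

rng = np.random.default_rng(1)
flip = rng.random(len(y_tr)) < 0.10      # assumed 10% class noise
y_noisy = np.where(flip, 1 - y_tr, y_tr)

models = {
    "bagging + tree": BaggingClassifier(DecisionTreeClassifier(),
                                        n_estimators=100, random_state=1),
    "random forest":  RandomForestClassifier(n_estimators=100,
                                             random_state=1),
}
for name, clf in models.items():
    clf.fit(X_tr, y_noisy)
    print(f"{name}: {clf.score(X_te, y_te):.3f}")
```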