2018
DOI: 10.1007/s10489-018-1314-z

Improving lazy decision tree for imbalanced classification by using skew-insensitive criteria

Cited by 14 publications (6 citation statements, published 2019–2024) · References 31 publications
“…The root-cause-tracing algorithm proposed in this paper selects old samples similar to a new sample as it enters the database and combines the two phases to construct a system of equations that locates the source of the fault. Because the method draws on a selected subset of the historical data rather than all of it, it is "lazy" in character [25]. Moreover, the algorithm not only refers to past experience but also fully accounts for the current situation, so it can deal effectively with new, sudden failures.…” (a sketch of the lazy selection step follows below)
Section: Construction of Root-Cause-Tracing Algorithm (mentioning)
confidence: 99%
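
For intuition, here is a minimal sketch of the "lazy" selection step described in the statement above: retrieving only the historical samples most similar to an incoming one, at the moment it arrives. The function name, the Euclidean similarity measure, and the k-nearest selection rule are illustrative assumptions; the cited paper's actual selection rule may differ.

```python
import numpy as np

def select_similar_samples(new_sample, history, k=5):
    """Lazily select the k historical samples closest to the new sample.

    Nothing is precomputed; similar old samples are retrieved only when a
    new sample arrives, which is what makes the approach "lazy".
    Euclidean distance is an assumed similarity measure, not necessarily
    the one used in the cited work.
    """
    history = np.asarray(history, dtype=float)
    new_sample = np.asarray(new_sample, dtype=float)
    dists = np.linalg.norm(history - new_sample, axis=1)
    return history[np.argsort(dists)[:k]]

# Usage: retrieve the 3 past fault records most similar to a new reading.
past_records = [[0.1, 0.9], [0.8, 0.2], [0.15, 0.85], [0.5, 0.5]]
print(select_similar_samples([0.12, 0.88], past_records, k=3))
```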
“…The Hellinger Distance (HD) is related to the Bhattacharyya Distance and belongs to the family of f-divergences [78]. Studies presented in [84], [85] showed that the Hellinger Distance can be used for classification. In the current scenario, this distance is widely used in machine learning, even as an alternative to methods such as entropy, to detect failures in classifiers [86] and breakpoints in the performance of those classifiers [87].…”
Section: Hellinger Distance (mentioning)
confidence: 99%
“…In the current scenario, this distance is widely used in machine learning, even as an alternative to methods such as entropy, to detect failures in classifiers [86] and breakpoints in the performance of those classifiers [87]. Furthermore, according to the literature, the Hellinger Distance has been used in many parametric models and has been very successful in solving statistical estimation problems [84], [85]. It is computed from two probability distributions p and q as follows [85]:…” (formula reconstructed below)
Section: Hellinger Distance (mentioning)
confidence: 99%
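
The formula itself did not survive extraction; what follows is the standard discrete form of the Hellinger distance, presumably what [85] gives (the exact normalization is an assumption; some authors drop the 1/√2 factor):

$$
H(p, q) = \frac{1}{\sqrt{2}} \sqrt{\sum_{i=1}^{n} \left( \sqrt{p_i} - \sqrt{q_i} \right)^2}
$$

With this normalization, H(p, q) ranges from 0 (identical distributions) to 1 (disjoint supports), which is what makes it usable as a bounded divergence measure in the classifier-monitoring applications cited above.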
“…To improve the performance of standard DTs, several splitting criteria have been proposed for constructing DTs, including the Distinct Class based Splitting Measure (DCSM) [Chandra et al., 2010], the Hellinger Distance Decision Tree (HDDT) [Cieslak and Chawla, 2008], and the Class Confidence Proportion Decision Tree (CCPDT) [Liu et al., 2010]. Besides these, to deal with the class imbalance problem in Lazy DT construction, two skew-insensitive split criteria based on the Hellinger distance and the K-L divergence were proposed in [Su and Cao, 2019]. Since Lazy DTs use the test instance to make splitting decisions, we omit them from our discussion in this paper.…” (a sketch of a Hellinger-based split criterion follows below)
Section: Introduction (mentioning)
confidence: 99%
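
To make the skew-insensitivity point concrete, here is a minimal Python sketch of an HDDT-style Hellinger split criterion for a two-way split with binary labels. The function name and the two-way-split restriction are illustrative assumptions; this follows the published HDDT formulation of Cieslak and Chawla, not necessarily the lazy-tree variant of Su and Cao (2019).

```python
import numpy as np

def hellinger_split_value(y, branch_idx):
    """Hellinger-distance criterion for a candidate binary split (HDDT-style).

    y          : array of binary labels (0 = majority, 1 = minority)
    branch_idx : boolean mask, True for samples routed to the left branch

    The criterion compares, per branch, the fraction of each class that the
    branch captures, so the class priors cancel out -- the source of the
    criterion's skew insensitivity. Assumes a two-way split and binary
    labels; an illustration only.
    """
    y = np.asarray(y)
    left = np.asarray(branch_idx, dtype=bool)
    pos, neg = (y == 1), (y == 0)
    total = 0.0
    for branch in (left, ~left):
        tpr = (branch & pos).sum() / max(pos.sum(), 1)  # P(branch | class 1)
        fpr = (branch & neg).sum() / max(neg.sum(), 1)  # P(branch | class 0)
        total += (np.sqrt(tpr) - np.sqrt(fpr)) ** 2
    return np.sqrt(total)

# Usage: higher values indicate splits that separate the classes better,
# regardless of how rare the minority class is.
y = np.array([1, 1, 0, 0, 0, 0, 0, 0])
split = np.array([True, True, True, False, False, False, False, False])
print(hellinger_split_value(y, split))
```

Note the design point: because both terms condition on the class (true-positive rate vs. false-positive rate), resampling the majority class leaves the criterion's value essentially unchanged, unlike entropy- or Gini-based measures that mix in the class priors.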