Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021
DOI: 10.1145/3459637.3482259

Geometric Heuristics for Transfer Learning in Decision Trees

Abstract: Motivated by a network fault detection problem, we study how recall can be boosted in a decision tree classifier, without sacrificing too much precision. This problem is relevant and novel in the context of transfer learning (TL), in which few target domain training samples are available. We define a geometric optimization problem for boosting the recall of a decision tree classifier, and show it is NP-hard. To solve it efficiently, we propose several near-linear time heuristics, and experimentally validate th…

Cited by 1 publication (1 citation statement)
References 30 publications (43 reference statements)
“…If an attribute value is continuous, it must first be discretized. The second stage of decision tree optimization prunes the isolated points and noise present in the training data, so that the constructed decision tree can compare the attribute values of sample data and thereby classify and mine unknown samples [32][33]. The generation process of the decision tree algorithm is shown in Figure 2.…”
Section: Decision Trees
confidence: 99%
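The two stages described in the citation statement — discretizing continuous attributes, then pruning away branches fit to isolated points and noise — can be sketched as follows. This is a minimal illustration using scikit-learn; the cited work does not name a library, and the bin count and pruning strength chosen here are arbitrary assumptions.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    # Stage 1: discretize continuous attribute values into ordinal bins.
    KBinsDiscretizer(n_bins=4, encode="ordinal", strategy="quantile"),
    # Stage 2: cost-complexity pruning (ccp_alpha > 0) removes branches
    # that only fit isolated points and noise in the training data.
    DecisionTreeClassifier(ccp_alpha=0.01, random_state=0),
)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

The pruned, discretized tree then classifies unknown samples by comparing their binned attribute values against the thresholds stored at each internal node.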