2021
DOI: 10.48550/arxiv.2108.07930
Preprint

A new semi-supervised inductive transfer learning framework: Co-Transfer

Zhe Yuan, Yimin Wen

Abstract: In many practical data mining scenarios, such as network intrusion detection, Twitter spam detection, and computer-aided diagnosis, a source domain that is different from but related to a target domain is very common. In addition, a large amount of unlabeled data is available in both source and target domains, but labeling each of them is difficult, expensive, time-consuming, and sometimes unnecessary. Therefore, it is very important and worthwhile to fully explore the labeled and unlabeled data in source and target domains […]

Cited by 1 publication (1 citation statement) | References 20 publications (29 reference statements)
“…TrAdaBoost is further extended. Yuan et al [14] proposed the Co-Transfer algorithm. The algorithm generates three TrAdaBoost classifiers from the source domain to the target domain at each iteration, and simultaneously generates another three TrAdaBoost classifiers for transfer learning from the target domain to the source domain.…”
Section: Introduction (mentioning; confidence: 99%)
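To make the iteration structure described in the quoted passage concrete, the following is a minimal, hypothetical Python sketch. It only mirrors the structure stated above (three TrAdaBoost-style learners transferring source to target and three transferring target to source per iteration); the function names, the simplified stand-in for TrAdaBoost, and the majority-vote pseudo-labeling step are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tradaboost_like_fit(X_aux, y_aux, X_main, y_main, seed=0):
    """Stand-in for a TrAdaBoost learner: trains on the union of
    auxiliary-domain and main-domain labeled data. A real TrAdaBoost
    would reweight auxiliary instances across boosting rounds; here a
    plain tree on the pooled data is used purely as a placeholder."""
    X = np.vstack([X_aux, X_main])
    y = np.concatenate([y_aux, y_main])
    clf = DecisionTreeClassifier(random_state=seed)
    clf.fit(X, y)
    return clf

def co_transfer_iteration(labeled_src, labeled_tgt, unlabeled_src, unlabeled_tgt):
    """One iteration of the structure described in the citation statement:
    three learners transferring source -> target and three transferring
    target -> source. Labels are assumed to be non-negative integers."""
    Xs, ys = labeled_src
    Xt, yt = labeled_tgt

    # Three TrAdaBoost-style classifiers from the source domain to the target domain.
    src_to_tgt = [tradaboost_like_fit(Xs, ys, Xt, yt, seed=k) for k in range(3)]
    # Three more, built in the same iteration, from the target domain to the source domain.
    tgt_to_src = [tradaboost_like_fit(Xt, yt, Xs, ys, seed=k) for k in range(3)]

    # Assumed co-training-style step: each group pseudo-labels the unlabeled
    # data of the domain it targets, by majority vote over the three learners.
    votes_tgt = np.stack([c.predict(unlabeled_tgt) for c in src_to_tgt])
    votes_src = np.stack([c.predict(unlabeled_src) for c in tgt_to_src])
    majority = lambda v: np.bincount(v.astype(int)).argmax()
    pseudo_tgt = np.apply_along_axis(majority, 0, votes_tgt)
    pseudo_src = np.apply_along_axis(majority, 0, votes_src)
    return (src_to_tgt, pseudo_tgt), (tgt_to_src, pseudo_src)
```

In a full method the pseudo-labeled examples would be fed back into the labeled pools and the loop repeated; that outer loop and the exact selection criteria are omitted here, since the quoted passage only specifies the per-iteration classifier structure.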