2017 IEEE International Conference on Information Reuse and Integration (IRI)
DOI: 10.1109/iri.2017.43
Analysis of Transfer Learning Performance Measures

Cited by 7 publications (3 citation statements) · References 31 publications
“…However, even though the performance gap is both data- and algorithm-dependent, the metric is considered crucial for a more informative and finer generalization bound. The study by Weiss and Khoshgoftaar [14] provides a discussion of the relative performance analysis of state-of-the-art transfer learning algorithms and traditional machine learning algorithms. Their analysis addresses the question of whether area under the curve (AUC) performance is predictive of classification accuracy in a transfer learning environment, where no labeled target data are available for validation.…”
Section: Critical State-of-the-Art Review of TL Performance Metrics
confidence: 99%
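The AUC-versus-accuracy question raised above can be illustrated with a minimal synthetic sketch (not from the paper; the Gaussian data, the fixed linear scorer, and the threshold rule are all illustrative assumptions). Because AUC is rank-based, a covariate shift that moves both classes equally leaves AUC nearly unchanged while the source-fitted decision threshold becomes miscalibrated and accuracy drops:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, shift):
    # Two 2-D Gaussian classes; `shift` translates the whole target domain.
    X0 = rng.normal(0.0 + shift, 1.0, size=(n, 2))
    X1 = rng.normal(1.5 + shift, 1.0, size=(n, 2))
    return np.vstack([X0, X1]), np.repeat([0, 1], n)

def auc(scores, y):
    # Mann-Whitney formulation: P(score of a positive > score of a negative).
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

# "Train" on the source domain: linear score w.x, threshold at the
# midpoint of the two class-mean scores.
Xs, ys = make_data(500, shift=0.0)
w = np.ones(2)
thresh = ((Xs[ys == 0] @ w).mean() + (Xs[ys == 1] @ w).mean()) / 2

results = {}
for shift in (0.0, 1.0):
    Xt, yt = make_data(500, shift=shift)   # unlabeled-at-training target
    s = Xt @ w
    acc = ((s > thresh).astype(int) == yt).mean()
    results[shift] = (auc(s, yt), acc)
    print(f"shift={shift}: AUC={results[shift][0]:.3f}  acc={results[shift][1]:.3f}")
```

Under the shift, AUC stays near its source value while accuracy falls sharply, which is why AUC alone cannot be assumed to predict accuracy when no labeled target data exist to recalibrate the threshold.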
“…Our proposition necessitates that we briefly discuss the issues of negative transfer and catastrophic forgetting. One conclusion from [14] is that analyzing the relative performance of TL algorithms across a wide range of distortion profiles should be considered an area for future research. Negative transfer occurs when the source domain data and tasks contribute to lower learning performance in the target domain.…”
Section: Critical State-of-the-Art Review of TL Performance Metrics
confidence: 99%
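Negative transfer, as defined in the excerpt above, can be detected empirically by comparing a target-only baseline against the same learner trained on pooled source and target data. A minimal synthetic sketch (illustrative only: the nearest-centroid learner and the shifted source distribution are assumptions, not the paper's method) where the shifted source data drags the learned centroids away from the target classes:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(n, mean0, mean1):
    # 1-D two-class problem, unit-variance Gaussians.
    X = np.concatenate([rng.normal(mean0, 1.0, n), rng.normal(mean1, 1.0, n)])
    return X, np.repeat([0, 1], n)

def centroid_acc(X_tr, y_tr, X_te, y_te):
    # Nearest-class-centroid classifier.
    c0, c1 = X_tr[y_tr == 0].mean(), X_tr[y_tr == 1].mean()
    pred = (np.abs(X_te - c1) < np.abs(X_te - c0)).astype(int)
    return (pred == y_te).mean()

Xt_tr, yt_tr = sample(20, 0.0, 2.0)     # small labeled target set
Xt_te, yt_te = sample(1000, 0.0, 2.0)   # target test set
Xs, ys = sample(500, 3.0, 5.0)          # large source set, shifted by +3

acc_target_only = centroid_acc(Xt_tr, yt_tr, Xt_te, yt_te)
acc_pooled = centroid_acc(np.concatenate([Xt_tr, Xs]),
                          np.concatenate([yt_tr, ys]),
                          Xt_te, yt_te)
print(f"target-only acc={acc_target_only:.3f}, with source acc={acc_pooled:.3f}")
# Negative transfer: adding the source data lowers target accuracy.
```

Here the large shifted source set overwhelms the 20 target examples per class, so the pooled centroids land far from the target classes and accuracy drops below the target-only baseline, which is precisely the condition the excerpt describes as negative transfer.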
“…This includes noise (either class or attribute), class imbalance, outliers, high dimensionality, as well as others [105][106][107]. These issues have a unique impact on transfer learning tasks because they may appear in either the source or the target domain, which creates changing circumstances [108][109][110].…”
Section: Comparative Analysis
confidence: 99%