2018 IEEE International Conference on Pervasive Computing and Communications (PerCom)
DOI: 10.1109/percom.2018.8444572

Stratified Transfer Learning for Cross-domain Activity Recognition

Abstract: In activity recognition, it is often expensive and time-consuming to acquire sufficient activity labels. To solve this problem, transfer learning leverages the labeled samples from the source domain to annotate the target domain, which has few or no labels. Existing approaches typically consider learning a global domain shift while ignoring the intra-affinity between classes, which hinders the performance of the algorithms. In this paper, we propose a novel and general cross-domain learning framework that…
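The abstract is truncated above. As a rough illustration of the contrast it draws between a global domain shift and class-wise (intra-class) alignment, here is a minimal Python sketch. It is not the paper's STL algorithm; the function names, the pseudo-labeling classifier, and the mean-shift alignment are all illustrative assumptions.

```python
# Minimal sketch (not the authors' STL implementation): contrasts a "global"
# alignment of source and target feature distributions with a class-wise
# ("stratified") alignment driven by pseudo-labels on the target domain.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def global_shift(Xs, Xt):
    """Align target features by matching the overall source mean (global shift)."""
    return Xt - Xt.mean(axis=0) + Xs.mean(axis=0)

def stratified_shift(Xs, ys, Xt, n_neighbors=1):
    """Align target features per class, using pseudo-labels for the target domain."""
    clf = KNeighborsClassifier(n_neighbors=n_neighbors).fit(Xs, ys)
    yt_pseudo = clf.predict(Xt)           # pseudo-labels for the unlabeled target
    Xt_aligned = Xt.copy()
    for c in np.unique(ys):
        mask = yt_pseudo == c
        if mask.any():                    # shift each pseudo-class toward its source class mean
            Xt_aligned[mask] += Xs[ys == c].mean(axis=0) - Xt[mask].mean(axis=0)
    return Xt_aligned, yt_pseudo
```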

Cited by 183 publications (116 citation statements)
References: 39 publications
“…• STL: Stratified Transfer Learning [40]. PCA and KPCA are classic dimensionality reduction methods, while TCA, GFK, TKL, and STL are representative transfer learning approaches.…”
Section: Evaluation of TNNAR
Mentioning (confidence: 99%)
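For context on the baselines named in this statement, the following is a hedged sketch of how PCA and KPCA baselines are commonly paired with a simple classifier in cross-domain evaluations; the component counts, RBF kernel, and 1-NN classifier are illustrative assumptions, not the settings of the cited evaluation.

```python
# Hedged sketch of PCA / KPCA baselines using scikit-learn; parameter values are assumed.
from sklearn.decomposition import PCA, KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

pca_baseline = make_pipeline(PCA(n_components=30),
                             KNeighborsClassifier(n_neighbors=1))
kpca_baseline = make_pipeline(KernelPCA(n_components=30, kernel="rbf"),
                              KNeighborsClassifier(n_neighbors=1))
# Each baseline is fit on labeled source-domain features and evaluated on the
# target domain, e.g. pca_baseline.fit(Xs, ys); pca_baseline.score(Xt, yt)
```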
“…The constructed datasets and STL code are available online². The implementations of all comparison methods follow [40]. Different from these works, which exploited feature extraction according to human knowledge, we take the original signal as the input.…”
Section: Evaluation of TNNAR
Mentioning (confidence: 99%)
“…Transfer learning (TL), or domain adaptation [12], is a promising strategy to enhance the learning performance on a target domain with few or no labels by leveraging knowledge from a well-labeled source domain. Since the source and target domains have different distributions, numerous methods have been proposed to reduce the distribution divergence [24,23,21,26,6]. Unfortunately, despite the great success achieved by existing TL methods, it is notoriously challenging to apply them to a real situation since we cannot determine the best TL model and its optimal hyperparameters.…”
Section: Introduction
Mentioning (confidence: 99%)
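As background for the statement about reducing distribution divergence, a commonly used divergence in this literature is the Maximum Mean Discrepancy (MMD), which methods such as TCA minimize. The sketch below is a generic biased MMD estimator in NumPy, not code from any of the cited methods; the RBF kernel and bandwidth are assumptions.

```python
# Generic biased estimator of squared MMD between source and target samples.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel values between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=1.0):
    """Biased estimate of squared MMD between source Xs and target Xt."""
    return (rbf_kernel(Xs, Xs, gamma).mean()
            + rbf_kernel(Xt, Xt, gamma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, gamma).mean())
```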
“…This paper is an extended version of our PerCom paper [20], where we proposed a stratified transfer learning algorithm for activity transfer. That algorithm is regarded as STL-SAT in this paper.…”
Section: Introduction
Mentioning (confidence: 99%)