2020 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv45572.2020.9093343
GradMix: Multi-source Transfer across Domains and Tasks

Cited by 8 publications (4 citation statements) | References 23 publications
“…The selection of the specific feature is dependent on the connection between the attribute-object pair provided in an external knowledge base. It can also help attribute-based domain and task transfer [83], [84]. Another direction is to extend our model to recognize more complex composite concepts whose cardinalities are larger than two (e.g.…”
Section: Discussion (mentioning)
confidence: 99%
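
The excerpt above describes picking a visual feature according to how an external knowledge base connects an attribute-object pair. As a rough illustration of that idea only, here is a minimal Python sketch; the knowledge-base entries, feature-type names, and select_feature helper are hypothetical stand-ins, not details from the cited work.

```python
# Hypothetical external knowledge base: maps an (attribute, object) pair
# to the feature type that best captures their interaction.
KNOWLEDGE_BASE = {
    ("sliced", "apple"): "shape",
    ("ripe", "apple"): "color",
    ("wet", "dog"): "texture",
}

def select_feature(attribute, obj, features):
    """Pick the precomputed feature whose type the knowledge base
    links to this attribute-object pair.

    `features` maps feature-type names (e.g. "color", "shape",
    "default") to feature vectors extracted from the input image.
    """
    feature_type = KNOWLEDGE_BASE.get((attribute, obj), "default")
    return features.get(feature_type, features["default"])
```

The point of the sketch is that the pair as a whole, not the attribute or object alone, indexes the choice of representation.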
“…However, most target classes of ImageNet, such as "flowers", "animals", and "foods", are irrelevant to the human activity classes, like "push-ups", "dribbling", or "pick-up" (see Figure 2). By contrast, BU101 is a set of web data containing static images that depict a human action, as shown in Figure 3, and it has been widely used to enhance the performance of activity identification in video sequences [64,131,134,135]. For these reasons, BU101 is selected for training our networks before they are transferred and trained on the target datasets.…”
Section: Previously Trained on ImageNet and BU101 (mentioning)
confidence: 99%
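
The recipe this excerpt describes is the standard pretrain-then-fine-tune transfer: train a network on BU101-style action images, then swap the classifier head and continue training on the target activity dataset. Below is a minimal PyTorch sketch of that recipe; the checkpoint filename, the ResNet-50 backbone, and the target class count are illustrative assumptions, not details taken from the cited paper.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 101  # placeholder for the target activity dataset

# Backbone previously trained on web action images (BU101 has 101 classes).
model = models.resnet50()
model.fc = nn.Linear(model.fc.in_features, 101)
state = torch.load("bu101_pretrained.pth")  # hypothetical checkpoint
model.load_state_dict(state)

# Replace the classifier head, then fine-tune on the target dataset.
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def finetune_step(images, labels):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```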
“…Luo et al. (2020) continued that line and suggested a method in the context of few-shot transfer learning, showing that using even a few samples from the target task can significantly improve the transferability of the trained models. Li et al. (2020) presented a similar idea but suggested adjusting learning rates for each layer as a means of improving the cosine similarity of different tasks.…”
Section: Few-shot Transfer Learning (mentioning)
confidence: 99%
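
Layer-wise learning rates of the kind attributed to Li et al. (2020) are usually implemented with optimizer parameter groups. The PyTorch sketch below shows only that mechanism; the geometric decay schedule and base rate are illustrative assumptions rather than values from the paper, and the network stem is left out of the optimizer (effectively frozen) for brevity.

```python
import torch
from torchvision import models

model = models.resnet18()

base_lr, decay = 1e-2, 0.5
# Earlier, more general layers get smaller learning rates;
# later, more task-specific layers get larger ones.
layers = [model.layer1, model.layer2, model.layer3, model.layer4, model.fc]
param_groups = [
    {"params": layer.parameters(),
     "lr": base_lr * decay ** (len(layers) - 1 - depth)}
    for depth, layer in enumerate(layers)
]

optimizer = torch.optim.SGD(param_groups, momentum=0.9)
```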