2016
DOI: 10.1016/j.neucom.2015.12.092

Flexible multi-task learning with latent task grouping

Abstract: In multi-task learning, using a task grouping structure has been shown to be effective in preventing inappropriate knowledge transfer among unrelated tasks. However, the group structure often has to be predetermined using prior knowledge or heuristics, which has no theoretical guarantee and could lead to unsatisfactory learning performance. In this paper, we present a flexible multi-task learning framework to identify latent grouping structures under agnostic settings, where the prior of the latent subspace is u…
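
The abstract describes learning a latent grouping of tasks jointly with the task models rather than fixing the groups in advance. Below is a minimal NumPy sketch of that general idea, not the paper's actual formulation: each task's weight vector is factorized as shared group bases times a soft group assignment, and both factors are learned jointly. The toy data, variable names, and hyperparameters are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: T linear-regression tasks in d dimensions drawn from k latent groups.
d, T, k, n = 10, 6, 2, 50
true_groups = np.array([0, 0, 0, 1, 1, 1])          # tasks 0-2 vs. tasks 3-5
group_w = rng.normal(size=(k, d))
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [X @ group_w[g] + 0.1 * rng.normal(size=n) for X, g in zip(Xs, true_groups)]

# Factorize the task-weight matrix W (d x T) as L @ softmax(logits):
# the columns of S = softmax(logits) are soft assignments of tasks to groups.
L = 0.1 * rng.normal(size=(d, k))
logits = np.zeros((k, T))

def softmax(z):
    z = z - z.max(axis=0, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=0, keepdims=True)

lam, lr = 1e-2, 0.05
for _ in range(1500):
    S = softmax(logits)
    W = L @ S                                        # one weight column per task
    gW = np.zeros_like(W)
    for t in range(T):
        gW[:, t] = Xs[t].T @ (Xs[t] @ W[:, t] - ys[t]) / n
    gL = gW @ S.T + lam * L                          # gradient through W = L @ S
    gS = L.T @ gW
    glogits = S * (gS - (S * gS).sum(axis=0, keepdims=True))  # softmax backprop
    L -= lr * gL
    logits -= lr * glogits

print("soft group assignments (k x T):")
print(np.round(softmax(logits), 2))
```

If the factorization fits, the recovered assignment columns should cluster tasks 0-2 and 3-5 into different groups without the grouping ever being supplied.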

Cited by 18 publications (6 citation statements)
References 19 publications

“…Although the high-frequency rate of price change chosen as the auxiliary target is related to the main target (the price movement), both targets are highly noisy [41,57], so the multi-task learning probably suffers from negative transfer [58], which will impair high-frequency price movement prediction. Besides, the feature extractor designed to capture diverse temporal-spatial dependencies is so complex that it inevitably incurs more noise [43][44][45], which will probably further aggravate the negative transfer and degrade the generalization of multi-task learning [58]. As a result, explicitly separating the INTER modules and the INTRA modules is likely to alleviate the potential detrimental interference between common and task-specific knowledge, which helps exploit the relevant information of the auxiliary task to improve high-frequency price movement prediction [56].…”
Section: The Sharing Structure
confidence: 99%
“…Zhang and Yeung presented a convex formulation for multi-task metric learning by modeling the task relationships in the form of a task covariance matrix [42]. Moreover, Zhong et al. presented a flexible multi-task learning framework to identify latent grouping structures in order to restrict negative knowledge transfer [44]. Multi-task learning has recently contributed to a number of successful real-world applications that gain better performance by exploiting knowledge shared across tasks.…”
Section: Multi-task Learning and Applications
confidence: 99%
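
The task-covariance approach mentioned in this quote can be sketched concretely: regularize the stacked task weights W by tr(W Ω⁻¹ Wᵀ) and alternate gradient steps on W with the trace-normalized closed-form update Ω ∝ (WᵀW)^{1/2}. The NumPy sketch below illustrates that general scheme, not Zhang and Yeung's implementation; the toy data and all constants are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
d, T, n = 8, 4, 40
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
w = rng.normal(size=d)
# Tasks 0 and 1 share w; tasks 2 and 3 use -w, so the learned task
# covariance should show a clear positive/negative block pattern.
ys = [Xs[0] @ w, Xs[1] @ w, Xs[2] @ (-w), Xs[3] @ (-w)]

W = 0.01 * rng.normal(size=(d, T))
Omega = np.eye(T) / T
lam, lr = 0.1, 0.05
for _ in range(500):
    Oinv = np.linalg.inv(Omega + 1e-6 * np.eye(T))
    gW = np.zeros_like(W)
    for t in range(T):
        gW[:, t] = Xs[t].T @ (Xs[t] @ W[:, t] - ys[t]) / n
    gW += 2 * lam * W @ Oinv        # gradient of lam * tr(W Oinv W^T)
    W -= lr * gW
    # Closed-form update: Omega = (W^T W)^{1/2} / tr((W^T W)^{1/2}).
    vals, vecs = np.linalg.eigh(W.T @ W)
    M = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    Omega = M / np.trace(M)

print("learned task covariance Omega:")
print(np.round(Omega, 2))
```

Because Ω is learned rather than prescribed, positively related, negatively related, and unrelated task pairs can all be expressed, which is what makes this family of methods a precursor to latent grouping.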
“…We observe that the vanilla approach to multi-task learning achieves limited results in our setting (Section 6). Previous work (Liu and Pan, 2017; Zhong et al., 2016; Jeong and Jun, 2018) has suggested that multi-task learning should be applied to related tasks. We present an extension to the multi-task learning setting that clusters related tasks to improve performance.…”
Section: Related Work
confidence: 99%
“…Several studies have shown that separating tasks into disjoint groups can boost classification performance. Intuitively, multi-task learning among mutually related tasks reduces prediction noise (Liu and Pan, 2017; Zhong et al., 2016; Jeong and Jun, 2018). We present an algorithm that divides all of the tasks, i.e., all entity predictions, into task groups according to their inherent relatedness.…”
Section: Task-Grouping Model Architecture
confidence: 99%
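
A minimal sketch of the group-then-train pattern this quote describes: fit each task independently, cluster the per-task weight vectors to recover groups (plain k-means here, chosen purely for illustration; the cited works define relatedness their own way), then retrain one shared model per group on the pooled group data. The toy data and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
d, T, n, k = 6, 8, 30, 2

# Toy data: two underlying clusters of linear-regression tasks.
centers = rng.normal(size=(k, d))
true_g = rng.integers(0, k, size=T)
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [X @ (centers[g] + 0.05 * rng.normal(size=d))
      for X, g in zip(Xs, true_g)]

# Step 1: fit each task independently (closed-form ridge regression).
def ridge(X, y, lam=1e-2):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

W = np.stack([ridge(X, y) for X, y in zip(Xs, ys)])   # T x d task weights

# Step 2: k-means on the weight vectors = grouping tasks by relatedness.
C = W[rng.choice(T, size=k, replace=False)]
for _ in range(20):
    assign = np.argmin(((W[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
    for j in range(k):
        if (assign == j).any():
            C[j] = W[assign == j].mean(axis=0)

# Step 3: retrain one shared model per group on the pooled group data.
group_models = {
    j: ridge(np.vstack([Xs[t] for t in np.where(assign == j)[0]]),
             np.concatenate([ys[t] for t in np.where(assign == j)[0]]))
    for j in range(k) if (assign == j).any()
}

print("recovered grouping:", assign)
print("true grouping:     ", true_g)
```

Pooling data only within a recovered group is what limits negative transfer here: unrelated tasks never share a model, while related tasks benefit from each other's samples.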