Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence 2017
DOI: 10.24963/ijcai.2017/351

Self-Paced Multitask Learning with Shared Knowledge

Abstract: This paper introduces self-paced task selection to multitask learning, where instances from more closely related tasks are selected in a progression of easier-to-harder tasks, to emulate an effective human education strategy, but applied to multitask machine learning. We develop the mathematical foundation for the approach based on iterative selection of the most appropriate task, learning the task parameters, and updating the shared knowledge, optimizing a new bi-convex loss function. This proposed method app…
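The alternating procedure the abstract outlines (select the currently easiest task, fit its parameters, refresh the shared knowledge, then raise the pace) can be sketched as follows. This is a minimal illustration under assumed choices: squared loss, linear per-task models, a shared mean parameter as the "knowledge", and a multiplicative pace schedule; none of these specifics come from the paper itself.

```python
# Minimal sketch of a self-paced multitask loop in the spirit of the
# abstract. All names and update rules are illustrative assumptions,
# not the authors' actual algorithm.
import numpy as np

def train_spmtl(tasks, n_rounds=20, lam=0.1, pace=1.5, tau=1.0):
    """tasks: list of (X, y) pairs, one linear-regression task each."""
    d = tasks[0][0].shape[1]
    W = np.zeros((len(tasks), d))      # per-task parameters
    w_bar = np.zeros(d)                # shared knowledge: mean parameter
    for _ in range(n_rounds):
        # 1) Select the currently "easiest" tasks: average loss below tau.
        losses = [np.mean((X @ W[t] - y) ** 2)
                  for t, (X, y) in enumerate(tasks)]
        active = [t for t, l in enumerate(losses) if l < tau]
        # 2) Update the selected tasks' parameters: closed-form ridge
        #    solution pulled toward the shared mean w_bar.
        for t in active:
            X, y = tasks[t]
            A = X.T @ X + lam * np.eye(d)
            W[t] = np.linalg.solve(A, X.T @ y + lam * w_bar)
        # 3) Update the shared knowledge from the tasks learned so far.
        if active:
            w_bar = W[active].mean(axis=0)
        # 4) Raise the pace threshold so harder tasks enter later rounds.
        tau *= pace
    return W, w_bar
```

Steps 1-3 mirror the bi-convex structure the abstract mentions: with the selection fixed, each parameter update is a convex problem, and vice versa.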

Citations: cited by 21 publications (16 citation statements). References: 1 publication (2 reference statements).
“…These models have achieved good results in solving small-scale data problems. In addition, with the development of Self-Paced Learning (SPL) [20], many excellent self-paced multi-task learning models have been proposed, such as SPMTL [21] and spMMTL [22]. SPL is defined as a supervised learning method that gradually adds training samples from easy to difficult, and it can typically reduce the risk of converging to poor local optima.…”
Section: B. Multi-task Learning
Confidence: 99%
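The easy-to-hard sample admission that this excerpt attributes to SPL has a standard closed form, originally from Kumar et al. (2010): with binary weights v and pace parameter K, the optimal v_i is 1 exactly when the sample's loss falls below 1/K. A minimal sketch:

```python
# Classic self-paced weighting rule (Kumar et al., 2010), matching the
# SPL definition quoted above; the variable names are illustrative.
import numpy as np

def spl_weights(losses, K):
    """Closed-form minimizer of  sum_i v_i*l_i - (1/K)*sum_i v_i  over
    binary v: v_i = 1 exactly when l_i < 1/K."""
    return (np.asarray(losses) < 1.0 / K).astype(float)

# Usage: start strict (large K), then anneal K so the curriculum
# moves from easy to hard samples.
losses = np.array([0.05, 0.4, 1.3, 2.2])
for K in (4.0, 1.0, 0.25):
    print(K, spl_weights(losses, K))   # more samples activate as K shrinks
```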
“…2) Self-Paced Mean Regularized MTL [22]: To address the issue of treating all tasks equally, Self-Paced Mean Regularized MTL (spMMTL) is introduced. It is a new …”
Section: Multi-task Learning Algorithms
Confidence: 99%
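The quote breaks off before stating the spMMTL objective, so the following is only a plausible reconstruction of what a mean-regularized self-paced MTL objective looks like: task losses gated by self-paced task weights, plus a penalty pulling each task's parameters toward their mean. The cited paper's exact regularizer may differ.

```latex
% Assumed form of a mean-regularized self-paced MTL objective,
% for T tasks with parameters w_t and pace parameter K:
\min_{W,\; v \in \{0,1\}^T} \;
  \sum_{t=1}^{T} v_t \, L_t(w_t)
  \;+\; \lambda \sum_{t=1}^{T} \lVert w_t - \bar{w} \rVert_2^2
  \;-\; \frac{1}{K} \sum_{t=1}^{T} v_t,
\qquad \bar{w} = \frac{1}{T} \sum_{t=1}^{T} w_t .
```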
“…Recently, self-paced multi-task learning (SPMTL) has been proposed for supervised problems. For instance, [46] proposed a self-paced task selection method for multi-task learning, and [47] proposed a novel multi-task learning framework that learns the tasks by simultaneously considering the order of both tasks and instances.…”
Section: Self-paced Learning
Confidence: 99%
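A two-level selection of the kind attributed to [47], ordering both tasks and the instances inside them, can be sketched with nested pace thresholds. The formulation below is an illustrative assumption, not the cited paper's exact scheme.

```python
# Sketch of self-paced selection at both granularities mentioned above:
# instances are admitted per task, then tasks are admitted by the
# average loss of their selected instances. Names are illustrative.
import numpy as np

def two_level_weights(per_task_losses, K_task, K_inst):
    """per_task_losses: list of 1-D arrays of per-instance losses.
    Returns (task_weights, list of instance-weight arrays)."""
    inst_w = [(l < 1.0 / K_inst).astype(float) for l in per_task_losses]
    # A task is admitted when its average *selected* loss is small;
    # tasks with no admitted instances are excluded outright.
    scores = [l[w > 0].mean() if w.any() else np.inf
              for l, w in zip(per_task_losses, inst_w)]
    task_w = np.array([1.0 if s < 1.0 / K_task else 0.0 for s in scores])
    return task_w, inst_w
```

Annealing K_task and K_inst together then sequences the curriculum over tasks and instances simultaneously, which is the joint ordering the excerpt describes.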
“…Multi-task learning (MTL), the process of learning to solve multiple tasks at the same time, allows sharing information across related tasks, thereby improving model performance across all the tasks (Caruana 1993; Zamir et al. 2018). Prior MTL studies can be divided into two categories: one is based on optimization (Li et al. 2017; Murugesan and Carbonell 2017), and the other is based on parameter-sharing approaches for deep neural networks.…”
Section: Introduction and Related Work
Confidence: 99%