2022
DOI: 10.1007/s10489-022-03695-x
HydaLearn

Cited by 3 publications (1 citation statement)
References 32 publications
“…The multi-task loss function is usually a linear combination of the individual single-task loss functions, so a weight must be determined for each task. Common dynamic weighting algorithms include Coefficient of Variations Weighting (CV-Weighting) [36], Highly Dynamic Learning [37], and Scaled Loss Approximate Weighting (SLAW) [38]. Considering both the research focus of the method in this paper and the sample size of the dataset, manual tuning is adopted to select the loss weights of all subtasks except the single-sample subtask of the left sample.…”
Section: Subtask Loss Function Weights Design
confidence: 99%
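The linear combination described in the statement above can be sketched as follows. This is a minimal illustration, not code from the cited paper; the function and variable names are hypothetical, and the weights stand in for values a practitioner would tune manually.

```python
def combined_loss(task_losses, weights):
    """Linearly combine per-task losses: L_total = sum_i w_i * L_i."""
    assert len(task_losses) == len(weights), "one weight per task"
    return sum(w * l for w, l in zip(weights, task_losses))

# Example: three subtasks with manually chosen weights.
losses = [0.8, 1.5, 0.3]   # per-task loss values for one batch
weights = [1.0, 0.5, 2.0]  # manually tuned, as in the statement above
total = combined_loss(losses, weights)
print(total)  # 1.0*0.8 + 0.5*1.5 + 2.0*0.3 = 2.15
```

Dynamic schemes such as CV-Weighting or SLAW would replace the fixed `weights` list with values recomputed from loss statistics at each training step.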