2021 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv48630.2021.00151
Multi-Loss Weighting with Coefficient of Variations

Abstract: Many interesting tasks in machine learning and computer vision are learned by optimising an objective function defined as a weighted linear combination of multiple losses. The final performance is sensitive to choosing the correct (relative) weights for these losses. Finding a good set of weights is often done by adopting them into the set of hyper-parameters, which are set using an extensive grid search. This is computationally expensive. In this paper, the weights are defined based on properties observed whi…
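The abstract is truncated before the method itself is stated, but its title names the key idea: weight each loss by its coefficient of variation (standard deviation over mean), so that a loss that still varies a lot relative to its magnitude gets more weight than one that has flattened out. A minimal NumPy sketch of that idea, not the paper's exact formulation (function name, history format, and the sum-to-one normalisation are illustrative assumptions):

```python
import numpy as np

def cov_weights(loss_history):
    """Weight each loss by its coefficient of variation (std / mean),
    computed over recorded loss values, normalised to sum to 1.

    loss_history: array-like of shape (steps, num_losses), one row of
    observed loss values per training step. A loss that still fluctuates
    relative to its magnitude receives a larger weight; a converged
    (flat) loss receives a smaller one.
    """
    history = np.asarray(loss_history, dtype=float)
    mean = history.mean(axis=0)
    std = history.std(axis=0)
    cov = std / (mean + 1e-12)   # coefficient of variation per loss
    return cov / cov.sum()       # normalise so the weights sum to 1

# Hypothetical histories for two losses on very different scales:
# the first still fluctuates, the second has nearly converged.
hist = [[10.0, 0.50], [8.0, 0.49], [12.0, 0.51], [9.0, 0.50]]
w = cov_weights(hist)  # first loss gets the larger weight
```

Because the coefficient of variation is scale-free, this sidesteps the unit mismatch between losses that the citation statements below discuss, without any grid search over weights.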

Cited by 34 publications (23 citation statements) · References 31 publications
“…For position and orientation specifically, notice how their computation is completely independent. This allows us to optimize their losses L_P and L_R separately, bypassing the issue where their units are different and thus require re-weighting [5]. This also allows us to train both networks with different sets of data, which is necessary since the behavior of the position and orientation networks should not be correlated.…”
Section: A. Training With Synthetic Data
confidence: 99%
“…In order to evaluate the benefits of combinations of two or more alignments, we employ the Multi-Loss Weighting with Coefficient of Variations (Groenendijk et al., 2021) technique (CoV) to calculate a weighted sum of auxiliary losses (Aux) that we add to the main XNLU losses L_ic and L_ec as follows:…”
Section: Adaptive Weighting Of Auxiliary Losses
confidence: 99%
“…Uncertainty is generally used in three main contexts. Firstly, it is used as a weight for loss or prediction re-weighting [20,35], where the contribution of each part of the loss function or each prediction is weighted based on the certainty level of each task (multi-task learning) [34], each loss function (single-task learning) [20], or each prediction [50]. Secondly, uncertainty is treated as guidance for pseudo-label quality estimation in weakly/semi-supervised learning [55,80].…”
Section: Uncertainty Related Applications
confidence: 99%
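The quote above contrasts CoV weighting with uncertainty-based re-weighting but does not show the latter's form. A common version of uncertainty weighting in multi-task learning scales each task loss by a learned precision and adds a log-variance penalty; the sketch below illustrates that general scheme, and is not claimed to be the exact formulation of refs [20] or [34] (function name and argument layout are assumptions):

```python
import numpy as np

def uncertainty_weighted_total(losses, log_vars):
    """Combine per-task losses via homoscedastic uncertainty weighting.

    losses:   per-task loss values L_i
    log_vars: learned log-variances log(sigma_i^2), one per task

    Each L_i is scaled by 1 / (2 * sigma_i^2), so uncertain tasks
    contribute less; the 0.5 * log(sigma_i^2) penalty keeps the model
    from inflating all variances to zero out the loss.
    """
    losses = np.asarray(losses, dtype=float)
    log_vars = np.asarray(log_vars, dtype=float)
    precision = np.exp(-log_vars)  # 1 / sigma_i^2
    return float(np.sum(0.5 * precision * losses + 0.5 * log_vars))

# With all log-variances at zero, this reduces to half the plain sum.
total = uncertainty_weighted_total([2.0, 4.0], [0.0, 0.0])
```

In practice the log-variances are trainable parameters optimised jointly with the network, whereas CoV weighting computes its weights from observed loss statistics without extra parameters.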