2020
DOI: 10.48550/arxiv.2006.04096
Preprint

Robust Learning Through Cross-Task Consistency

Abstract: Visual perception entails solving a wide set of tasks, e.g., object detection, depth estimation, etc. The predictions made for multiple tasks from the same image are not independent and are therefore expected to be 'consistent'. We propose a broadly applicable and fully computational method for augmenting learning with Cross-Task Consistency. The proposed formulation is based on inference-path invariance over a graph of arbitrary tasks. We observe that learning with cross-task consistency leads to more acc…
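The inference-path invariance idea in the abstract can be sketched as a penalty on the disagreement between a direct prediction and a composed cross-task prediction. The toy linear "networks" and their names below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

# Toy linear maps standing in for trained task predictors.
def f_x_to_y1(x, W):    # e.g. image -> depth
    return x @ W

def f_x_to_y2(x, V):    # e.g. image -> surface normals, direct path
    return x @ V

def f_y1_to_y2(y1, U):  # e.g. depth -> surface normals, cross-task path
    return y1 @ U

def consistency_loss(x, W, V, U):
    """Inference-path invariance: the direct path x->y2 and the
    composed path x->y1->y2 should agree; their mean squared
    difference is the cross-task consistency penalty."""
    direct = f_x_to_y2(x, V)
    composed = f_y1_to_y2(f_x_to_y1(x, W), U)
    return float(np.mean((direct - composed) ** 2))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
W = rng.normal(size=(4, 3))
U = rng.normal(size=(3, 3))
print(consistency_loss(x, W, W @ U, U))  # ~0: both paths agree up to float error
print(consistency_loss(x, W, rng.normal(size=(4, 3)), U))  # > 0: inconsistent paths
```

During training, this penalty would be added to each task's supervised loss so that the predictors are pulled toward path-invariant solutions.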

Cited by 1 publication (4 citation statements)
References 37 publications
“…Epistemic uncertainty accounts for uncertainty in the model parameters, while aleatoric uncertainty stems from the noise inherent in the data. There are many proposed methods to estimate the former, such as using dropout [12,46], stochastic variational inference methods [5,14,37,36,50,42], ensembling [32], and consistency energy [57] where a single uncalibrated uncertainty estimate is extracted from consistency of different paths. Most of the existing methods in this area solely estimate uncertainty without using it towards improving the predictions.…”
Section: Related Work
confidence: 99%
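The ensembling approach [32] mentioned in the excerpt above reduces, at its core, to treating disagreement among independently trained models as the epistemic uncertainty estimate. A minimal sketch, using toy perturbed linear models as stand-ins for trained networks:

```python
import numpy as np

def ensemble_predict(models, x):
    """Deep-ensemble sketch: the mean over independently trained
    models is the consolidated prediction, and their variance is
    used as the epistemic uncertainty estimate."""
    preds = np.stack([m(x) for m in models])  # (n_models, n_outputs)
    return preds.mean(axis=0), preds.var(axis=0)

# Toy "independently trained models": one base linear map with
# random perturbations (illustrative assumption only).
rng = np.random.default_rng(1)
W = rng.normal(size=(4, 2))
models = [lambda x, dW=rng.normal(scale=0.1, size=W.shape): x @ (W + dW)
          for _ in range(5)]

mean, epistemic = ensemble_predict(models, rng.normal(size=(4,)))
print(mean.shape, epistemic.shape)  # (2,) (2,)
```

The excerpt's closing criticism applies directly here: the variance is computed but nothing in the pipeline feeds it back to improve `mean`.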
“…Enforcing consistency constraints in the context of cross-task predictions involves ensuring that the output predictions remain the same regardless of the intermediate domain [57,34,60,53]. In contrast particularly to [57], which uses (non-probabilistic) training-time consistency constraints to improve a network's prediction and has no consolidation mechanism, our goal is to robustify the final prediction by merging the outputs of multiple prediction paths at test time. Our formulation and the training-time consistency constraints are complementary.…”
Section: Related Work
confidence: 99%
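The test-time consolidation described in this excerpt can be sketched as a weighted merge of path outputs. The inverse-deviation weighting below is a simple stand-in assumption, not the cited work's actual probabilistic scheme:

```python
import numpy as np

def merge_paths(path_preds, eps=1e-8):
    """Merge predictions from several inference paths at test time.
    Paths that deviate more from the consensus get lower weight
    (inverse-deviation weighting; illustrative assumption only)."""
    preds = np.stack(path_preds)            # (n_paths, n_outputs)
    consensus = preds.mean(axis=0)
    dev = np.abs(preds - consensus).mean(axis=1) + eps
    w = (1.0 / dev) / np.sum(1.0 / dev)     # weights sum to 1
    return w @ preds

paths = [np.array([1.0, 1.0]), np.array([1.0, 1.0]), np.array([10.0, 10.0])]
print(merge_paths(paths))  # outlier path downweighted relative to the plain mean
```

With the two agreeing paths at [1, 1] and one outlier at [10, 10], the merged output lands below the unweighted mean of [4, 4], which is the robustification the excerpt describes.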