2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI)
DOI: 10.1109/isbi48211.2021.9434167
Multimix: Sparingly-Supervised, Extreme Multitask Learning from Medical Images

Cited by 7 publications (4 citation statements). References 7 publications.
“…This will improve the generalizability of the CNN models. MTL can be defined as the optimization of several losses in a single model such that shared representation learning can execute related tasks [4]. Multi-task learning in the context of advanced learning is often carried out using either soft or hard parameter sharing of the convolutional layers.…”
Section: Multi-task Learning Methods
confidence: 99%
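The hard parameter sharing mentioned in the statement above can be made concrete with a small sketch: one shared convolutional trunk feeds two task-specific heads, and the per-task losses are summed into a single objective ("optimization of several losses in a single model"). This is a minimal PyTorch-style sketch; the layer sizes, task heads, and loss weights are illustrative assumptions, not taken from the cited papers.

```python
# Minimal sketch of hard parameter sharing for multi-task learning.
# Layer sizes, heads, and loss weights are illustrative assumptions.
import torch
import torch.nn as nn

class HardSharedMTL(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Shared convolutional trunk: its parameters are reused by every task.
        self.shared = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # Task-specific heads on top of the shared representation.
        self.cls_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(32, num_classes))
        self.seg_head = nn.Conv2d(32, 1, 1)  # per-pixel mask logits

    def forward(self, x):
        feats = self.shared(x)
        return self.cls_head(feats), self.seg_head(feats)

model = HardSharedMTL()
x = torch.randn(4, 1, 64, 64)
cls_target = torch.randint(0, 2, (4,))
seg_target = torch.rand(4, 1, 64, 64).round()

cls_logits, seg_logits = model(x)
# Several losses optimized in a single model: the task losses are combined
# into one objective so the shared layers learn from both tasks.
loss = nn.CrossEntropyLoss()(cls_logits, cls_target) \
     + 0.5 * nn.BCEWithLogitsLoss()(seg_logits, seg_target)
loss.backward()
```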
“…This will improve the generalization of the model. MTL can be defined as the optimization of several loss functions in a single model such that shared representation learning can execute related tasks [4].…”
Section: Introduction
confidence: 99%
“…Comparisons with multi-task learning methods. We further compared the performance of the proposed model with several state-of-the-art MTL methods [34,63,64] for normal tissue and lesion discrimination on the THH dataset to verify the effectiveness of our proposed cross-task guidance. Based on Table 4, CTG-Net has a significant advantage on most of the metrics compared to other methods.…”
Section: PLOS ONE
confidence: 99%
“…Semi-supervised learning combines a small amount of labeled data with a large amount of unlabeled data during training (Chapelle, Scholkopf, and Zien 2009). As a result, semi-supervised learning has received much attention as an alternative to fully-labeled datasets, and many works and methods have been proposed (Chen et al. 2020; Sohn et al. 2020; Haque et al. 2020; Laine and Aila 2017; Imran et al. 2020; Sun et al. 2019; Imran and Terzopoulos 2019b). In many real-world tasks, there are large datasets with only a small subset of labeled data, as annotations require domain expertise, are expensive, and are time-consuming.…”
Section: Introduction
confidence: 99%
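As a rough illustration of the labeled-plus-unlabeled training regime described in this last statement, the sketch below combines a supervised loss on a small labeled batch with a confidence-thresholded pseudo-label loss on a larger unlabeled batch, loosely in the spirit of methods such as Sohn et al. (2020). The stand-in model, threshold tau, and weight lam are illustrative assumptions, not the method of the cited paper.

```python
# Minimal sketch of a semi-supervised objective mixing a small labeled batch
# with a larger unlabeled one via confidence-thresholded pseudo-labels.
# Model, threshold, and weighting are illustrative assumptions.
import torch
import torch.nn.functional as F

def semi_supervised_loss(model, x_lab, y_lab, x_unlab, tau=0.95, lam=1.0):
    # Standard supervised term on the few labeled examples.
    sup = F.cross_entropy(model(x_lab), y_lab)

    # Pseudo-labels from the model's own confident predictions on unlabeled
    # data; low-confidence samples are masked out of the unsupervised term.
    with torch.no_grad():
        probs = F.softmax(model(x_unlab), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = (conf >= tau).float()

    unsup = (F.cross_entropy(model(x_unlab), pseudo, reduction="none") * mask).mean()
    return sup + lam * unsup

# Usage with a stand-in linear classifier on 10-d features.
model = torch.nn.Linear(10, 3)
x_lab, y_lab = torch.randn(8, 10), torch.randint(0, 3, (8,))
x_unlab = torch.randn(64, 10)
loss = semi_supervised_loss(model, x_lab, y_lab, x_unlab)
loss.backward()
```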