2020
DOI: 10.1007/978-3-030-39752-4_5
Semi-supervised Semantic Segmentation of Multiple Lumbosacral Structures on CT

Cited by 2 publications (1 citation statement)
References 15 publications
“…Pauline et al. [28] trained a convolutional semantic segmentation network together with an adversarial network that discriminates between segmentation maps drawn from the ground truth and those produced by the segmentation network, in order to detect and correct higher-order inconsistencies between the ground truth and the generated map. Huaqing Liu et al. [45] proposed semi-cGAN, based on cGAN, to segment lumbosacral structures on thin-layer computed tomography with only a few labeled samples. Souly et al. [46] leveraged a large amount of available unlabeled or weakly labeled data, together with synthetic images created by a GAN, to achieve semi-supervised learning; Hung et al. [47] subsequently improved on this approach.…”
Section: Related Work
confidence: 99%
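The adversarial scheme summarized in the excerpt pairs a standard per-pixel cross-entropy loss with an adversarial term that rewards the segmentation network when the discriminator scores its output maps as "real" (i.e., indistinguishable from ground-truth maps). A minimal sketch of that combined objective is below; the function names, the toy shapes, and the weight `lam` are illustrative assumptions, not the papers' actual implementations.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax over the class axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def seg_ce_loss(probs, onehot, eps=1e-8):
    """Per-pixel cross-entropy between predicted class probabilities
    and one-hot ground-truth labels, averaged over pixels."""
    return -np.mean(np.sum(onehot * np.log(probs + eps), axis=-1))

def adv_loss(d_score_on_pred, eps=1e-8):
    """Generator-side adversarial term: push the discriminator's score
    on predicted maps toward 1 ('looks like a ground-truth map')."""
    return -np.mean(np.log(d_score_on_pred + eps))

def total_seg_loss(probs, onehot, d_score_on_pred, lam=0.1):
    """Combined objective: supervised cross-entropy plus a weighted
    adversarial term (lam is an illustrative hyperparameter)."""
    return seg_ce_loss(probs, onehot) + lam * adv_loss(d_score_on_pred)

# Toy example: a 4x4 image with 3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 4, 3))
probs = softmax(logits)
labels = rng.integers(0, 3, size=(4, 4))
onehot = np.eye(3)[labels]
# Hypothetical discriminator output on the predicted map (a probability).
d_score = np.array([0.4])
loss = total_seg_loss(probs, onehot, d_score, lam=0.1)
```

In the full method, the discriminator is trained in alternation on ground-truth and predicted maps, and on unlabeled images only the adversarial term (and, in [47], a confidence-masked self-training term) drives the segmentation network.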