2022
DOI: 10.1016/j.neucom.2022.04.021

ASS-GAN: Asymmetric semi-supervised GAN for breast ultrasound image segmentation

Cited by 34 publications (16 citation statements)
References 19 publications

“…The two generators may supervise each other, generating reliable segmentation predicted masks as advice for each other in the absence of labels. The experimental findings demonstrate that the suggested strategy performs well even when there are a limited number of tagged photos [88]. To obtain more accurate medical picture segmentation, x-Net, a novel deep model, is presented.…”
Section: Neuroscience
confidence: 88%
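The statement above describes the core mechanism credited to ASS-GAN: two segmentation generators that supervise each other by exchanging predicted masks as pseudo-labels when ground truth is unavailable. The sketch below illustrates that cross-supervision idea only in general terms; it is not the authors' code, and the toy network, loss weights, and 0.5 threshold are assumptions made for illustration.

```python
# Minimal sketch (not the ASS-GAN implementation) of two generators that
# pseudo-label unlabeled images for each other while sharing a supervised loss
# on the labeled subset. Architectures and hyperparameters are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySegNet(nn.Module):
    """Stand-in segmentation generator; the real generators are asymmetric and deeper."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)  # mask logits

g_a, g_b = TinySegNet(), TinySegNet()
opt = torch.optim.Adam(list(g_a.parameters()) + list(g_b.parameters()), lr=1e-4)

def training_step(x_lab, y_lab, x_unlab, unsup_weight=0.5):
    """Supervised loss on labeled images plus cross pseudo-supervision on unlabeled ones."""
    logits_a_lab, logits_b_lab = g_a(x_lab), g_b(x_lab)
    sup = (F.binary_cross_entropy_with_logits(logits_a_lab, y_lab)
           + F.binary_cross_entropy_with_logits(logits_b_lab, y_lab))

    logits_a_u, logits_b_u = g_a(x_unlab), g_b(x_unlab)
    # Each generator's thresholded, detached prediction acts as a pseudo-mask for the other.
    pseudo_a = (torch.sigmoid(logits_a_u) > 0.5).float().detach()
    pseudo_b = (torch.sigmoid(logits_b_u) > 0.5).float().detach()
    unsup = (F.binary_cross_entropy_with_logits(logits_a_u, pseudo_b)
             + F.binary_cross_entropy_with_logits(logits_b_u, pseudo_a))

    loss = sup + unsup_weight * unsup
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage: random tensors stand in for ultrasound images and binary masks.
x_l = torch.randn(2, 1, 64, 64)
y_l = (torch.rand(2, 1, 64, 64) > 0.5).float()
x_u = torch.randn(2, 1, 64, 64)
print(training_step(x_l, y_l, x_u))
```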
“…Multiple attention mechanisms, as well as a sequential training strategy using a weakly-supervised learning scheme, are employed to improve both classification and localization performance on partially annotated datasets. Future studies will be conducted to comprehensively investigate innovative methods to further improve model performance, such as advanced attention module design 7,17 and representation learning via self-supervised learning. 18,19…”
Section: Discussion
confidence: 99%
“…The application of the proposed model implied the viability of a dual intended automated system with segmentation and extraction of low-dimensional deep radiomics from ROI. The proposed model exhibited relatively considerable accuracy compared to the state-of-art models, i.e., ASS-GANs [ 54 ], and W-Net [ 55 ], yet the combination of concurrent low-dimensional radiomics increases the contribution of this model, which significantly reduces the training process and required data compared cascading multiple models. Similarly, our model exhibited considerable growth in model performance for classifying breast cancer patients from benign cases ( Figure 7 , and Table 2 ).…”
Section: Discussion
confidence: 99%