2023
DOI: 10.1016/j.media.2022.102676
RadFormer: Transformers with global–local attention for interpretable and accurate Gallbladder Cancer detection

Cited by 14 publications (13 citation statements)
References 45 publications
“…As a result, the network is compelled to learn a more refined representation of GB malignancy while reconstructing the masked tokens. We report an accuracy of 96.4% using our approach as against 84% by the current SOTA of GBCNet [5] and RadFormer [6].…”
Section: Introduction (mentioning)
Confidence: 89%
“…al [8] later utilized unsupervised contrastive learning to learn malignancy representations. On the other hand, [6] exploits a transformer-based dual-branch architecture for accurate and explainable GBC detection. [21] investigates the application of transformers for differentiating GBC from xanthogranulomatous cholecystitis.…”
Section: Related Work (mentioning)
Confidence: 99%