2022
DOI: 10.1109/access.2022.3206449
A Survey on Attention Mechanisms for Medical Applications: are we Moving Toward Better Algorithms?

Abstract: The increasing popularity of attention mechanisms in deep learning algorithms for computer vision and natural language processing has made these models attractive to other research domains. In healthcare, there is a strong need for tools that can improve the routines of clinicians and patients. Naturally, the adoption of attention-based algorithms for medical applications occurred smoothly. However, healthcare being a domain that depends on high-stakes decisions, the scientific community must consider whether these hig…

Cited by 11 publications (2 citation statements) · References 203 publications (214 reference statements)

Citation statements, ordered by relevance:
“…There is, however, some outstanding criticism regarding attention modelling. One issue that experts mention is that attention models are usually fitted on top of pre-existing CNN backbone architectures [60]. Therefore, the question remains: to what extent do the attention models (and, in turn, their performance) rely on the backbone architecture on which they are placed?…”
Section: Discussion
confidence: 99%
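The criticism above can be made concrete with a sketch: the attention module only ever sees the backbone's feature maps, never the raw input, so its behaviour is inseparable from the representation the backbone provides. This is a minimal, hypothetical illustration in numpy (both `backbone` and `channel_attention` are stand-ins, not any published architecture):

```python
import numpy as np

def backbone(x):
    """Stand-in for a pre-trained CNN backbone (hypothetical):
    maps a (3, H, W) image to an (8, H, W) feature map via a
    fixed random 1x1 projection."""
    rng = np.random.default_rng(42)
    w = rng.standard_normal((8, x.shape[0]))
    return np.einsum('ci,ihw->chw', w, x)

def channel_attention(feat):
    """Generic channel attention bolted on top of the backbone:
    softmax over per-channel average activations, used to
    reweight the channels."""
    z = feat.mean(axis=(1, 2))          # one score per channel
    a = np.exp(z - z.max())
    a /= a.sum()                        # attention weights, sum to 1
    return feat * a[:, None, None]

img = np.random.default_rng(0).standard_normal((3, 16, 16))
out = channel_attention(backbone(img))  # attention never sees raw pixels
```

Swapping `backbone` for a different feature extractor changes `z`, and therefore the attention weights, even though `channel_attention` itself is unchanged; this is exactly the dependence the cited criticism points at.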
“…The incorporation of attention mechanisms to further refine the feature extraction process is a concept of considerable merit [12]. Attention mechanisms automatically weight the pivotal segments of a signal, allowing the model to selectively emphasize the information that improves fault-diagnosis accuracy [13]. The Squeeze-and-Excitation Network (SENet) introduced by Hu et al. [14] establishes inter-channel attention that captures the interdependence among feature channels, enabling adaptive feature recalibration.…”
Section: Introduction
confidence: 99%
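The squeeze-excite-scale pipeline mentioned in the statement above can be sketched in a few lines. This is a minimal numpy version of the SE operation (the weight shapes and the reduction ratio `r=2` are illustrative, not SENet's published configuration):

```python
import numpy as np

def se_block(features, w1, b1, w2, b2):
    """Squeeze-and-Excitation recalibration on a (C, H, W) feature map.
    w1/b1 and w2/b2 are the two FC layers of the excitation MLP
    (hypothetical shapes: (C, C//r) and (C//r, C))."""
    # Squeeze: global average pooling -> one descriptor per channel
    z = features.mean(axis=(1, 2))                    # (C,)
    # Excitation: bottleneck MLP with ReLU, then a sigmoid gate
    s = np.maximum(0.0, z @ w1 + b1)                  # (C//r,)
    s = 1.0 / (1.0 + np.exp(-(s @ w2 + b2)))          # (C,), each in (0, 1)
    # Scale: reweight each channel by its learned importance
    return features * s[:, None, None]

# toy example: C = 4 channels, reduction ratio r = 2
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w1, b1 = rng.standard_normal((4, 2)), np.zeros(2)
w2, b2 = rng.standard_normal((2, 4)), np.zeros(4)
y = se_block(x, w1, b1, w2, b2)
```

Because the gate is a sigmoid, each channel is scaled by a factor strictly between 0 and 1, which is the "adaptive feature recalibration" the statement refers to.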