2022
DOI: 10.48550/arxiv.2206.10177
Preprint

TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks

Abstract: Spiking Neural Networks (SNNs) are a practical approach toward more data-efficient deep learning, simulating neurons that leverage temporal information. In this paper, we propose the Temporal-Channel Joint Attention (TCJA) architectural unit, an efficient SNN technique based on attention mechanisms, which effectively enforces the relevance of the spike sequence along both spatial and temporal dimensions. Our essential technical contribution lies in: 1) compressing the spike stream into an average matrix by emp…
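
To make the mechanism concrete, below is a minimal PyTorch sketch of the temporal-channel joint attention idea as described in the abstract: the spike stream is averaged over space into a T × C matrix, scored along the temporal and channel axes with 1-D convolutions, and the fused scores recalibrate the input. The class name, the tensor layout [T, B, C, H, W], and the multiplicative fusion are assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class TCJASketch(nn.Module):
    """Illustrative temporal-channel joint attention (not the official TCJA code).

    Assumes a spike tensor x of shape [T, B, C, H, W]: T time steps,
    batch B, C channels, spatial H x W.
    """
    def __init__(self, channels: int, time_steps: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # 1-D conv along the temporal axis, one filter bank per channel.
        self.temporal_conv = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        # 1-D conv along the channel axis, one filter bank per time step.
        self.channel_conv = nn.Conv1d(time_steps, time_steps, kernel_size, padding=pad)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t, b, c, h, w = x.shape
        # "Compress the spike stream into an average matrix": mean over space,
        # giving one T x C summary per sample.
        z = x.mean(dim=(3, 4))               # [T, B, C]
        z = z.permute(1, 0, 2)               # [B, T, C]
        # Temporal branch: convolve over T for each channel.
        t_score = self.temporal_conv(z.transpose(1, 2)).transpose(1, 2)  # [B, T, C]
        # Channel branch: convolve over C for each time step.
        c_score = self.channel_conv(z)                                   # [B, T, C]
        # Joint attention: fuse both branches and squash to (0, 1).
        attn = torch.sigmoid(t_score * c_score)                          # [B, T, C]
        attn = attn.permute(1, 0, 2).reshape(t, b, c, 1, 1)
        return x * attn                      # recalibrated spike features
```

For example, TCJASketch(channels=16, time_steps=4) applied to a torch.rand(4, 2, 16, 8, 8) input returns a tensor of the same shape, gated per (time step, channel) pair; the elementwise product of the two branch scores is one plausible reading of "joint" attention, chosen here for brevity.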

Cited by 11 publications (25 citation statements)
References 12 publications
“…While the TRF module based on temporal convolution is a straightforward linear computation, the FLI module incorporates non-linear components, resulting in a non-linear relationship between temporal dependencies that improves spatio-temporal feature extraction. We notice that the function of this structure closely resembles that of the attention module; therefore, we refer to the attention blocks (Hu et al, 2018; Yao et al, 2021; Zhu et al, 2022), and propose the FLI module to replicate the gating mechanism in synaptic connections. The module details are shown in Figure 3C.…”
Section: Feedforward Lateral Inhibition (mentioning)
Confidence: 89%
“…The SE block (Hu et al, 2018) offers an efficient attention approach to improve representations in ANNs. Xie et al (2016) and Kundu et al (2021) introduced spatial-wise attention in SNNs; then, TA-SNN (Yao et al, 2021) developed a temporal-wise attention mechanism in SNNs by assigning attention factors to each input frame; subsequently, TCJA (Zhu et al, 2022) added a channel-wise attention module and proposed temporal-channel joint attention. These studies demonstrate the usefulness of attention mechanisms in SNNs by achieving state-of-the-art results on various datasets.…”
Section: Attention Modules in SNNs (mentioning)
Confidence: 99%
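
As a concrete reference point for the attention lineage surveyed in this statement, here is a minimal sketch of squeeze-and-excitation channel attention (the SE block of Hu et al., 2018) in PyTorch. The layer sizes and the reduction ratio are illustrative defaults, not values taken from any of the cited papers.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation channel attention (Hu et al., 2018), minimal form."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # squeeze to a bottleneck
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # excite back to C scores
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))           # squeeze: global average pool -> [B, C]
        w = self.fc(s).view(b, c, 1, 1)  # per-channel gates in (0, 1)
        return x * w                     # recalibrate channels
```

The same squeeze-gate-rescale pattern underlies the temporal-wise (TA-SNN) and temporal-channel (TCJA) variants described above, which change the axis being pooled and gated rather than the overall structure.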