2020 IEEE 20th International Conference on Communication Technology (ICCT)
DOI: 10.1109/icct50939.2020.9295665
Design and Implementation of LSTM Accelerator Based on FPGA

Cited by 12 publications (2 citation statements)
References 10 publications
“…While traditional LSTM models only consider time-series dependencies, ConvLSTM combines the strengths of LSTM and convolutional neural networks (CNN), allowing it to effectively capture spatiotemporal dependencies and model both temporal and spatial dimensions [27]. Additionally, traditional LSTM primarily relies on matrix multiplication and element-wise operations [28], which limits its capacity to model data nonlinearly. In contrast, by utilizing multiple filters and the local connectivity of the convolution kernel, ConvLSTM can learn richer feature representations and extract more complex spatiotemporal patterns in the data.…”
Section: Discussion
confidence: 99%
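The excerpt above contrasts the dense matrix-multiplication gates of a standard LSTM with the convolutional gates of a ConvLSTM. A minimal sketch of that difference, using an illustrative 4×4 input grid, a hypothetical 3×3 kernel, and a naive `conv2d_same` helper (all shapes and weights are assumptions, not from the cited papers):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(x, k):
    """Naive 'same'-padded 2-D cross-correlation over a single channel."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

# Standard LSTM input gate: dense matmul over a flattened vector,
# so any 2-D spatial layout of the input is lost.
x_vec = np.random.default_rng(0).standard_normal(16)
W = np.random.default_rng(1).standard_normal((8, 16))
i_lstm = sigmoid(W @ x_vec)               # shape (8,)

# ConvLSTM-style input gate: a local 3x3 kernel slides over the 2-D
# grid, so the gate activation keeps the spatial layout.
x_grid = x_vec.reshape(4, 4)
K = np.random.default_rng(2).standard_normal((3, 3))
i_conv = sigmoid(conv2d_same(x_grid, K))  # shape (4, 4)

print(i_lstm.shape, i_conv.shape)
```

The local connectivity of the kernel is what lets ConvLSTM capture the spatiotemporal patterns the excerpt describes: each gate activation depends only on a neighborhood of the input, and multiple kernels yield multiple feature maps.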
“…The control unit structure of the LSTM is shown in Figure 1 below [5]. To avoid the long-term dependency problem of time series in the LSTM network structure, the RNN neuron was given a special design. This design simulates human memory patterns, controlling the memory unit through three gates: a "forget gate", an "input gate", and an "output gate", together with a "cell state" that records historical information.…”
Section: Principles of LSTM Algorithm
confidence: 99%
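The gate structure described in the excerpt can be sketched as one time step of a standard LSTM cell. This is a minimal illustration with hypothetical stacked weight shapes (`W`, `U`, `b` are assumed parameters, not taken from the accelerator design in the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step of a standard LSTM cell.

    x: input (n_in,); h_prev, c_prev: previous hidden/cell state (n_h,);
    W: (4*n_h, n_in), U: (4*n_h, n_h), b: (4*n_h,) stacked gate parameters.
    """
    z = W @ x + U @ h_prev + b
    n_h = h_prev.shape[0]
    f = sigmoid(z[0 * n_h:1 * n_h])   # forget gate: how much history to keep
    i = sigmoid(z[1 * n_h:2 * n_h])   # input gate: how much new info to write
    o = sigmoid(z[2 * n_h:3 * n_h])   # output gate: how much state to expose
    g = np.tanh(z[3 * n_h:4 * n_h])   # candidate cell update
    c = f * c_prev + i * g            # cell state records historical information
    h = o * np.tanh(c)                # hidden output for this time step
    return h, c

rng = np.random.default_rng(0)
n_in, n_h = 4, 3
h, c = lstm_step(rng.standard_normal(n_in),
                 np.zeros(n_h), np.zeros(n_h),
                 rng.standard_normal((4 * n_h, n_in)),
                 rng.standard_normal((4 * n_h, n_h)),
                 np.zeros(4 * n_h))
print(h.shape, c.shape)  # (3,) (3,)
```

Because `f`, `i`, and `o` gate the cell state multiplicatively, gradients can flow through `c` across many time steps, which is how the design avoids the long-term dependency problem the excerpt mentions. The heavy operations, matrix multiplications followed by element-wise gating, are also exactly the workload an FPGA accelerator would pipeline.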