Multi-domain Attention Fusion Network For Language Recognition (2022)
DOI: 10.1007/s42979-022-01447-9

Cited by 3 publications (1 citation statement); references 38 publications.
“…As shown in Figure 1, the GRU model replaces the input gate, forget gate, and output gate of the LSTM with an update gate and a reset gate. The update gate controls how much of the previous state is retained, and the reset gate determines whether to merge the previous moment with the current moment [23,24]. The detailed process is as follows: first, the two gating states are obtained from the output h_{t-1} of the previous time step and the input of the current time step.…”
Section: GRU (mentioning)
Confidence: 99%
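
To make the gating described in the quoted passage concrete, below is a minimal NumPy sketch of a single GRU step. It follows the common Cho et al. formulation; the parameter names (W_z, U_z, b_z, etc.) and the interpolation convention h_t = (1 - z_t) * h_{t-1} + z_t * h_tilde are illustrative assumptions, not details taken from the cited paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, params):
    # Unpack per-gate parameters: input weights W, recurrent weights U, bias b.
    W_z, U_z, b_z = params["z"]  # update gate
    W_r, U_r, b_r = params["r"]  # reset gate
    W_h, U_h, b_h = params["h"]  # candidate state

    # Both gating states are computed from the previous output h_{t-1}
    # and the current input x_t, as the quoted passage describes.
    z_t = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)  # update gate
    r_t = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)  # reset gate

    # The reset gate scales how much of h_{t-1} feeds the candidate state,
    # i.e., whether the previous moment is merged with the current one.
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r_t * h_prev) + b_h)

    # The update gate interpolates between the old state and the candidate,
    # controlling how much of the previous state is retained.
    return (1.0 - z_t) * h_prev + z_t * h_tilde

# Example: one step with random parameters (shapes only; values are arbitrary).
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
params = {k: (rng.standard_normal((d_h, d_in)),
              rng.standard_normal((d_h, d_h)),
              np.zeros(d_h))
          for k in ("z", "r", "h")}
h_t = gru_cell(rng.standard_normal(d_in), np.zeros(d_h), params)

Compared with the LSTM's three gates and separate cell state, this two-gate design computes everything from h_{t-1} and x_t alone, which is what makes the GRU cheaper per step.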