2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC)
DOI: 10.1109/itnec48623.2020.9084784
Chinese Text Sentiment Analysis Based on BI-GRU and Self-attention

Cited by 20 publications (7 citation statements). References 4 publications.
“…The BiGRU-Attention neural network model proposed in [20]. c. The BiLSTM-Attentions model proposed in [21]. This model combines BiLSTM with two layers of Attention.…”
Section: Comparison Experiments and Parameter Settings (citation type: mentioning, confidence: 99%)
“…To incorporate contextual information, RNNs introduce a memory unit to retain crucial features so that long-distance dependencies between tokens can be captured. The classic RNN variants, long short-term memory [36] and gated recurrent units [37], are currently the most popular networks used for text sentiment analysis. In addition, sequential models with an attention mechanism have proved effective in sentiment analysis [38].…”
Section: Related Work (citation type: mentioning, confidence: 99%)
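
As background for the gated recurrent units mentioned in the excerpt above, a standard GRU cell in the common Cho et al. (2014) formulation computes the following (shown here for orientation, not taken from the cited paper; gate conventions vary slightly across implementations):

\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)}\\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)}\\
\tilde{h}_t &= \tanh\left(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\right) && \text{(candidate state)}\\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}

When $z_t \approx 0$ the previous hidden state is copied forward almost unchanged, which is what lets the unit retain features over long distances without the vanishing-gradient problem of a plain RNN.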
“…(3) Atten-BiGRU [16]: combines the attention mechanism with BiGRU to learn text features more precisely from context and to extract deep semantic features of the text, with fewer parameters and shorter training time than LSTM.…”
Section: Benchmark Experiments (citation type: mentioning, confidence: 99%)
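
To show how such an attention-over-BiGRU classifier fits together, here is a minimal PyTorch sketch. It is an illustration under assumed layer sizes, not the implementation from [16] or from the surveyed paper; the class name AttenBiGRU and all dimensions are placeholders.

import torch
import torch.nn as nn

class AttenBiGRU(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU reads the sentence left-to-right and right-to-left.
        self.bigru = nn.GRU(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Additive attention: one scalar score per time step.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                 # (batch, seq_len)
        x = self.embedding(token_ids)             # (batch, seq_len, embed_dim)
        h, _ = self.bigru(x)                      # (batch, seq_len, 2*hidden_dim)
        scores = self.attn(torch.tanh(h))         # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)    # attention over time steps
        context = (weights * h).sum(dim=1)        # weighted sum: (batch, 2*hidden_dim)
        return self.classifier(context)           # (batch, num_classes)

# Usage sketch:
# model = AttenBiGRU(vocab_size=10000)
# logits = model(torch.randint(0, 10000, (8, 20)))  # batch of 8 sequences, length 20

The softmax over time steps lets the model weight sentiment-bearing tokens more heavily than function words, while the bidirectional GRU supplies both left and right context at each position.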