2019
DOI: 10.3390/fi11110237
Feature Fusion Text Classification Model Combining CNN and BiGRU with Multi-Attention Mechanism

Abstract: Convolutional neural networks (CNN) and long short-term memory (LSTM) have gained wide recognition in the field of natural language processing. However, due to the pre- and post-dependence of natural language structure, relying solely on a CNN for text categorization will ignore the contextual meaning of words, while bidirectional long short-term memory (BiLSTM) …. The feature fusion model is divided into a multiple-attention (MATT) CNN model and a bi-directional gated recurrent unit (BiGRU) model. The CNN m…

Cited by 34 publications (17 citation statements)
References 26 publications (30 reference statements)
“…With the continuous development of deep learning technologies, several new models have been introduced. In particular, the bidirectional gated recurrent unit (BiGRU), developed for a purpose similar to that of our models, is being introduced into text classification technology [28]. Similar to BiLSTM, BiGRU extends GRU, and it shows better performance than BiLSTM.…”
Section: Discussion
confidence: 99%
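The bidirectional reading these statements describe — one GRU pass left-to-right, another right-to-left, with the per-step hidden states concatenated so every position carries both past and future context — can be sketched in NumPy. This is a minimal illustrative sketch, not the cited models' implementation; all weight names, dimensions, and the random initialization are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: update gate z, reset gate r, candidate state h~."""
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate state
    return (1 - z) * h + z * h_tilde

def bigru(seq, params_f, params_b, hidden):
    """Run one GRU forward and one backward; concatenate per-step states."""
    T = seq.shape[0]
    hf, hb = np.zeros(hidden), np.zeros(hidden)
    fwd, bwd = [], [None] * T
    for t in range(T):                          # left-to-right pass
        hf = gru_cell(seq[t], hf, *params_f)
        fwd.append(hf)
    for t in reversed(range(T)):                # right-to-left pass
        hb = gru_cell(seq[t], hb, *params_b)
        bwd[t] = hb
    return np.stack([np.concatenate([f, b]) for f, b in zip(fwd, bwd)])

rng = np.random.default_rng(0)
d_in, d_h, T = 4, 3, 5                          # toy sizes, hypothetical
make = lambda: tuple(rng.standard_normal(s) * 0.1
                     for s in [(d_in, d_h), (d_h, d_h)] * 3)
out = bigru(rng.standard_normal((T, d_in)), make(), make(), d_h)
print(out.shape)  # (5, 6): each timestep = forward state + backward state
```

The concatenated output dimension is twice the hidden size, which is why downstream classification layers on top of a BiGRU take 2·d_h inputs.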
“…GRU networks can only process sequences from front to back, which leads to information loss. Therefore, bidirectional GRUs have been widely used in many NLP tasks, including sentiment analysis (SA), text classification, named entity recognition, and question answering, e.g., [19][20][21][22], to process data in both directions and provide complete context information.…”
Section: GRU Network
confidence: 99%
“…Sanda et al [30] conducted a comparative analysis of text feature representation models and verified the impact of text feature representation on the classification effect. In addition, multiple network models were combined [31][32][33][34] to apply feature fusion to text classification tasks with good results. Jang et al [35] proposed a hybrid model of BiLSTM and CNN based on the attention mechanism, which captures the correlation of adjacent words to improve the classification accuracy.…”
Section: Introduction
confidence: 99%
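The fusion idea described above — convolutional n-gram features combined with attention-pooled recurrent features into one representation — can be sketched as follows. This is an illustrative sketch only: the filter tensor, attention query, and the random stand-in for BiGRU outputs are all hypothetical, not the cited papers' actual architecture.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cnn_branch(emb, filt):
    """1D convolution over token windows, ReLU, then max-over-time pooling."""
    k = filt.shape[0]                                      # window size
    windows = np.stack([emb[i:i + k].ravel()
                        for i in range(len(emb) - k + 1)])
    return np.maximum(windows @ filt.reshape(-1, filt.shape[-1]), 0).max(axis=0)

def attention_pool(states, query):
    """Score each timestep against a query vector; return the weighted sum."""
    weights = softmax(states @ query)
    return weights @ states

rng = np.random.default_rng(1)
T, d = 6, 8                                   # toy sequence length and width
emb = rng.standard_normal((T, d))             # token embeddings (hypothetical)
states = rng.standard_normal((T, d))          # stand-in for BiGRU outputs
filt = rng.standard_normal((3, d, 4))         # window 3, 4 feature maps
fused = np.concatenate([cnn_branch(emb, filt),
                        attention_pool(states, rng.standard_normal(d))])
print(fused.shape)  # (12,): 4 CNN features + 8 attended recurrent features
```

Concatenation is the simplest fusion choice; the hybrid models cited here feed such a joint vector into a final classification layer so that local n-gram cues and long-range context are weighed together.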