2020
DOI: 10.1088/1742-6596/1575/1/012130

A Hybrid GCN and RNN Structure Based on Attention Mechanism for Text Classification

Abstract: In the field of deep learning, recurrent neural networks are usually better suited to problems and tasks that are sensitive to time series, such as natural language processing or speech recognition. Long short-term memory (LSTM) is a representative recurrent network structure; it models temporal dependencies and enables a global representation of features. However, issues such as the number of network parameters in LSTMs limit the applicability of their solutions. This paper proposes an improved hybrid…
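The abstract describes the architecture only at a high level. As a hedged illustration, below is a minimal PyTorch sketch of one plausible reading: a GCN branch over a token graph and an LSTM branch over the token sequence, fused per token and pooled by additive attention. Every layer size, the row-normalized adjacency input, and the concatenation-based fusion are assumptions for illustration, not details confirmed by the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridGCNRNN(nn.Module):
    # Hypothetical hybrid GCN + LSTM text classifier with additive attention.
    # All dimensions and the fusion scheme are assumptions, not the paper's spec.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gcn = nn.Linear(embed_dim, hidden_dim)      # one graph-convolution layer
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)         # additive attention scorer
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, adj):
        # token_ids: (B, T) word indices; adj: (B, T, T) row-normalized token graph
        x = self.embed(token_ids)                        # (B, T, E)
        g = F.relu(self.gcn(torch.bmm(adj, x)))          # GCN branch: (B, T, H)
        s, _ = self.lstm(x)                              # LSTM branch: (B, T, H)
        fused = torch.cat([g, s], dim=-1)                # per-token fusion: (B, T, 2H)
        w = F.softmax(self.attn(fused).squeeze(-1), dim=-1)
        doc = (w.unsqueeze(-1) * fused).sum(dim=1)       # attention pooling: (B, 2H)
        return self.out(doc)

# Example call with random inputs and an identity adjacency (assumed shapes):
model = HybridGCNRNN(vocab_size=10000)
ids = torch.randint(0, 10000, (4, 20))
adj = torch.eye(20).expand(4, 20, 20)
logits = model(ids, adj)                                 # (4, 2)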

Cited by 1 publication (1 citation statement; year published: 2022) · References 7 publications
“…It only requires developing encoder–decoder (codec) models in the graph setting, and a softmax can be applied at every node. Graph convolutional networks [17] tend to recover missing entities/relationships and can also perform the task of entity classification. Test results on a few datasets show that these network frameworks can mitigate the problems that arise from missing data in the knowledge base.…”
Section: Phase II: Data Classification (mentioning)
confidence: 99%
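To make the per-node softmax in the citation statement concrete, here is a minimal sketch (in PyTorch, not code from the cited works) of a two-layer graph convolutional network that classifies every entity node in a knowledge graph. The normalized adjacency input and all layer dimensions are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EntityGCN(nn.Module):
    # Hypothetical two-layer GCN for entity (node) classification.
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, num_classes)

    def forward(self, features, adj_norm):
        # features: (N, in_dim) node features; adj_norm: (N, N) normalized adjacency
        h = F.relu(self.fc1(adj_norm @ features))  # aggregate neighbors, then transform
        logits = self.fc2(adj_norm @ h)            # second propagation step
        return F.softmax(logits, dim=-1)           # softmax applied on every node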