In deep learning, recurrent neural networks are usually better suited to time-dependent problems such as natural language processing and speech recognition. Long short-term memory (LSTM) is a representative recurrent architecture: it captures temporal dependencies and supports a global representation of features. However, issues such as the large number of network parameters in LSTMs limit the applicability of their solutions. This paper proposes an improved hybrid structure combining a graph convolutional neural network with a recurrent neural network. In the input layer, a two-dimensional convolutional neural network is used to build a graph over the text corpus. Graph embedding preserves the global structure of the whole text graph. An LSTM layer and an attention mechanism then carry out the text classification and improve computational efficiency. Test results show that the hybrid network structure achieves better running speed on the IMDb dataset.
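A minimal NumPy sketch of such a hybrid pipeline may help make the abstract concrete. All dimensions, weight initializations, and the single-layer design below are illustrative assumptions, not the paper's actual architecture: one graph-convolution layer embeds the token graph, an LSTM consumes the embedded node features in sequence, and attention pools the hidden states into a document vector for a downstream classifier.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def graph_conv(A, X, W):
    # One graph-convolution layer: symmetrically normalized adjacency
    # times node features times a weight matrix, with ReLU.
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def lstm_step(x, h, c, Wx, Wh, b):
    # Single LSTM step; gates stacked as [input, forget, output, candidate].
    z = Wx @ x + Wh @ h + b
    H = h.shape[0]
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c_new = f * c + i * g
    return o * np.tanh(c_new), c_new

def attention_pool(states, v):
    # Score each hidden state, softmax the scores, return the weighted sum.
    scores = states @ v
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ states

rng = np.random.default_rng(0)
N, F, H = 5, 8, 6                    # tokens (graph nodes), feature dim, hidden dim
A = (rng.random((N, N)) > 0.5).astype(float)
A = np.maximum(A, A.T)               # symmetric adjacency for the text graph
X = rng.standard_normal((N, F))      # initial token features
Wg = rng.standard_normal((F, F)) * 0.1

Z = graph_conv(A, X, Wg)             # graph-embedded token features

Wx = rng.standard_normal((4 * H, F)) * 0.1
Wh = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
states = []
for t in range(N):                   # feed embedded tokens through the LSTM
    h, c = lstm_step(Z[t], h, c, Wx, Wh, b)
    states.append(h)

doc_vec = attention_pool(np.stack(states), rng.standard_normal(H))
print(doc_vec.shape)                 # fixed-size document representation
```

In a real model the graph construction, embedding, and LSTM/attention weights would all be learned jointly; this sketch only shows how the three stages compose.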