2019
DOI: 10.30630/joiv.3.4.289
Efficient processing of GRU based on word embedding for text classification

Abstract: Text classification has become a pressing problem for large organizations managing vast amounts of online data, and it has been extensively applied in Natural Language Processing (NLP) tasks. Text classification helps users manage and exploit meaningful information that must be sorted into various categories for further use. To classify texts as well as possible, our research aims to develop a deep learning approach that achieves superior performance in text classification compared to ot…

Cited by 61 publications (38 citation statements)
References 20 publications
“…A very effective way of handling the embedding vectors using GRU states is a plus for many Natural Language Processing (NLP) problems such as text classification. The effectiveness of GRU is demonstrated by an experimental study conducted by the researchers of [27]. They used the TREC dataset with 6 distinct classes and the Google snippets dataset with 8 distinct classes to prove the strength of GRU for NLP problem-solving.…”
Section: B Deep Learning Techniquesmentioning
confidence: 99%
“…O Yildirim [28] [25] proposed a unified structure to investigate the effects of word embedding and the Gated Recurrent Unit (GRU) for text classification on two benchmark datasets (Google snippets and TREC). GRU is a well-known type of recurrent neural network (RNN), which has the ability to process sequential data through its recurrent architecture.…”
Section: -Literature Surveymentioning
confidence: 99%
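The excerpt above describes a GRU processing sequences of word-embedding vectors through its recurrent architecture. As a minimal sketch of that mechanism (not the paper's implementation — the dimensions, random initialization, and helper names here are illustrative assumptions), the standard GRU step computes an update gate, a reset gate, and a candidate state, then interpolates between the old and candidate hidden states:

```python
import numpy as np

def gru_step(x, h, params):
    """One GRU step over an embedding vector x given hidden state h."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # interpolated new state

def init_params(embed_dim, hidden_dim, seed=0):
    """Small random weights; purely illustrative, not trained."""
    rng = np.random.default_rng(seed)
    W = lambda: rng.normal(0.0, 0.1, (hidden_dim, embed_dim))
    U = lambda: rng.normal(0.0, 0.1, (hidden_dim, hidden_dim))
    b = lambda: np.zeros(hidden_dim)
    return (W(), U(), b(), W(), U(), b(), W(), U(), b())

# Run a short sequence of (hypothetical) word-embedding vectors through
# the GRU; in a classifier the final hidden state would feed a softmax.
embed_dim, hidden_dim = 8, 4
params = init_params(embed_dim, hidden_dim)
rng = np.random.default_rng(1)
sentence = rng.normal(size=(5, embed_dim))  # 5 tokens, 8-dim embeddings
h = np.zeros(hidden_dim)
for x in sentence:
    h = gru_step(x, h, params)
print(h.shape)
```

Because the initial state is zero and each step interpolates toward a tanh-bounded candidate, every component of the final state stays within (-1, 1) — the bounded-state property that makes GRUs stable over long text sequences.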
“…In [3] the DBN performs jointly with SVM to achieve better results in a Chinese text classification algorithm; CNN is presented in [14] for the labeling of semantic roles. Both [15,16] address the classification of long sentences. One alternative to the traditional CNN is Network In Network (NIN), proposed by [17], where the 1×1 conv-filter used is a Multi-Layer Perceptron (MLP) instead of the conventional linear filters …”
Section: Related Workmentioning
confidence: 99%