2021
DOI: 10.7717/peerj-cs.570

A comparative analysis on question classification task based on deep learning approaches

Abstract: Question classification is one of the essential tasks for implementing automatic question answering in natural language processing (NLP). Recently, several text-mining tasks such as text classification, document categorization, web mining, sentiment analysis, and spam filtering have been successfully addressed by deep learning approaches. In this study, we illustrated and investigated our work on certain deep learning approaches for question classification tasks in an extremely inflected…

Cited by 22 publications (14 citation statements) | References 35 publications
“…The research (Zulqarnain et al., 2023) focuses on attention-aware deep learning approaches for effective stress classification. The E-LSTM, an enhancement of the traditional LSTM that incorporates pre-attention, achieved superior performance compared with other classification methods, including Naïve Bayes, SVM, deep belief network, and standard LSTM.…”
Section: Acknowledgments
confidence: 99%
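The quoted E-LSTM is described only at a high level, so the sketch below is an assumption-laden illustration of the general idea: computing attention weights over the input sequence before a standard LSTM consumes it ("pre-attention"). The scoring scheme, names, and dimensions are hypothetical, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pre_attention(X, w):
    """Re-weight each timestep of X (T x d) before an LSTM consumes it.

    Hypothetical pre-attention: a learned scoring vector w (d,) yields
    one relevance weight per timestep.
    """
    scores = X @ w                # (T,) unnormalized relevance scores
    alpha = softmax(scores)       # (T,) attention weights summing to 1
    return X * alpha[:, None]     # re-weighted sequence, still (T, d)

# Illustrative usage: 10 timesteps of 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 8))
w = rng.normal(size=8)
X_weighted = pre_attention(X, w)  # would then be fed to a standard LSTM
```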
“…
- Compute the update gate z_t and the reset gate r_t using equations (8) and (7).
- Compute the candidate state ĥ_t using equation (9).
- while the stopping criterion is not met do
  - while training over all instances do
    - Train on a mini-batch of the dataset as network input.
…”
Section: Step 3: Model Training and Validation
confidence: 99%
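The quoted procedure follows the standard GRU recurrence. As a concrete reference for the gates it names (update gate z_t, reset gate r_t, candidate state ĥ_t), here is a minimal numpy sketch; the parameter names and the toy loop are illustrative assumptions, not the paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, W, U, b):
    """One GRU step; W, U, b hold the z/r/h parameter triples."""
    z_t = sigmoid(W["z"] @ x_t + U["z"] @ h_prev + b["z"])            # update gate
    r_t = sigmoid(W["r"] @ x_t + U["r"] @ h_prev + b["r"])            # reset gate
    h_hat = np.tanh(W["h"] @ x_t + U["h"] @ (r_t * h_prev) + b["h"])  # candidate state
    return (1.0 - z_t) * h_prev + z_t * h_hat                         # new hidden state

# Toy mini-batch loop mirroring the quoted procedure (illustrative only).
d_in, d_h = 4, 3
rng = np.random.default_rng(0)
W = {k: rng.normal(size=(d_h, d_in)) for k in "zrh"}
U = {k: rng.normal(size=(d_h, d_h)) for k in "zrh"}
b = {k: np.zeros(d_h) for k in "zrh"}

for epoch in range(2):                                         # "while stopping criterion not met"
    for batch in np.split(rng.normal(size=(8, 5, d_in)), 4):   # mini-batches of instances
        for seq in batch:                                      # each instance is a sequence
            h = np.zeros(d_h)
            for x_t in seq:                                    # unroll the GRU over timesteps
                h = gru_cell(x_t, h, W, U, b)
            # a real implementation would compute a loss here and update W, U, b
```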
“…In this proposed architecture, autoencoder layers were added to the GRU network to enhance its ability for dimensionality reduction. Moreover, the learning algorithm of the typical GRU architecture with AE layers is carried out in two key phases: 1) first, it effectively reduces the input feature vector x and renders the raw input features in a highly compressed representation; 2) the output x̄ produced by equation (5) is then fed as input to the GRU network and evaluated by the update gate and reset gate based on equations (8) and (7), presented in Section 2.…”
Section: The Proposed NAE-GRU Model
confidence: 99%
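As described, the NAE-GRU first compresses the raw feature vector x with autoencoder layers and then drives the GRU gates with the compressed representation x̄. A minimal sketch of that two-phase flow follows; the layer sizes, names, and single-step usage are assumptions for illustration, not the proposed model itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(x, W_enc, b_enc):
    """Phase 1: an autoencoder layer compresses raw features x to x_bar."""
    return sigmoid(W_enc @ x + b_enc)

def gru_step(x_bar, h_prev, W, U, b):
    """Phase 2: the compressed x_bar drives the update and reset gates."""
    z = sigmoid(W["z"] @ x_bar + U["z"] @ h_prev + b["z"])
    r = sigmoid(W["r"] @ x_bar + U["r"] @ h_prev + b["r"])
    h_hat = np.tanh(W["h"] @ x_bar + U["h"] @ (r * h_prev) + b["h"])
    return (1.0 - z) * h_prev + z * h_hat

# Illustrative dimensions: 16 raw features compressed to 6; hidden size 4.
rng = np.random.default_rng(1)
d_raw, d_code, d_h = 16, 6, 4
W_enc, b_enc = rng.normal(size=(d_code, d_raw)), np.zeros(d_code)
W = {k: rng.normal(size=(d_h, d_code)) for k in "zrh"}
U = {k: rng.normal(size=(d_h, d_h)) for k in "zrh"}
b = {k: np.zeros(d_h) for k in "zrh"}

x = rng.normal(size=d_raw)
x_bar = encode(x, W_enc, b_enc)                  # phase 1: dimensionality reduction
h = gru_step(x_bar, np.zeros(d_h), W, U, b)      # phase 2: gated recurrent update
```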
“…In this case, the problem consists of finding a correspondence between questions and answers, which are usually textual elements (sentences, summaries, etc.). Machine learning is very efficient for this type of problem [12,10,25], even though some rule-based approaches have also shown interesting results [14,6,15]. In this type of application, the task is to identify fine-level class labels (and, as a consequence, a large number of classes) on a semantic basis, such labels being used as keys for accessing the answers [10].…”
Section: Introduction
confidence: 99%
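The passage above treats question classification as mapping a question to a fine-grained semantic label that keys into candidate answers. A minimal scikit-learn sketch of that mapping follows; the label set (loosely modeled on the coarse:fine TREC convention) and the toy data are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy question/label pairs; fine-level labels serve as answer-lookup keys.
questions = [
    "Who wrote Hamlet?",
    "When did World War II end?",
    "Where is Mount Everest located?",
    "Who painted the Mona Lisa?",
]
labels = ["HUM:person", "NUM:date", "LOC:place", "HUM:person"]

# Bag-of-words features feeding a linear classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(questions, labels)

# The predicted label would then key into the answer-retrieval stage.
print(clf.predict(["Who discovered penicillin?"]))  # expected: ['HUM:person']
```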