2018 International Conference on Asian Language Processing (IALP)
DOI: 10.1109/ialp.2018.8629256
Relevance-Based Automated Essay Scoring via Hierarchical Recurrent Model

Cited by 20 publications (6 citation statements)
References 12 publications
“…We tried to contact the authors for details of the sub-features used in their research, but received no response. The Topic-BiLSTM-attention system [86], which works via a hierarchical recurrent model, does not provide any details about the features used in its published research. We also want to emphasize that extracting a massive number of features, versus 23 features, adds to time complexity as well.…”
Section: G Results and Discussion
confidence: 99%
“…We compared our HNN-AES model with the following baselines: LSTM-CNN-ATT (Dong et al, 2017), Topic-BiLSTM-attention (Chen & Li, 2018), and SSS-AES (Janda et al, 2019). LSTM-CNN-ATT obtains essay representations via a hierarchical sentence-document model.…”
Section: Accuracy
confidence: 99%
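The citation statement above describes LSTM-CNN-ATT as obtaining essay representations via a hierarchical sentence-document model: a convolutional encoder builds sentence vectors, and an attention layer pools them into a single essay representation. A minimal NumPy sketch of that idea follows; the layer sizes, random weights, and `sentence_encoder`/`attention_pool` helpers are illustrative assumptions, not the published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sentence_encoder(word_vecs, conv_w):
    """Convolve word vectors (window size 3) and max-pool into one sentence vector."""
    n_words, dim = word_vecs.shape
    windows = [word_vecs[i:i + 3].ravel() for i in range(n_words - 2)]
    feats = np.tanh(np.stack(windows) @ conv_w)   # (n_windows, hidden)
    return feats.max(axis=0)                      # max-pooling over time

def attention_pool(sent_vecs, att_w):
    """Softmax-weight the sentence vectors and sum them into an essay vector."""
    scores = sent_vecs @ att_w                    # (n_sentences,)
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                          # attention weights sum to 1
    return alpha @ sent_vecs

dim, hidden = 8, 16
conv_w = rng.normal(size=(3 * dim, hidden))
att_w = rng.normal(size=hidden)
score_w = rng.normal(size=hidden)

# A toy "essay": 4 sentences of 5-7 random word embeddings each.
essay = [rng.normal(size=(n, dim)) for n in (5, 7, 6, 5)]
sent_vecs = np.stack([sentence_encoder(s, conv_w) for s in essay])
essay_vec = attention_pool(sent_vecs, att_w)
score = 1.0 / (1.0 + np.exp(-(essay_vec @ score_w)))  # scaled score in (0, 1)
```

In the real models the convolution and attention weights are trained end-to-end against human scores; here they are random, so only the data flow (words → sentences → essay → score) is meaningful.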
“…The workflow of the machine learning framework reported in Taghipour and Ng (2016) is illustrated in Figure 4. Looking at recent trends in AES, the machine learning framework has gained popularity in recent years due to the efficacy of SVM (Ratna et al., 2019b; Ratna et al., 2019a; Xu et al., 2017; Awaida et al., 2019; Chen & Li, 2018) and the ability to represent text context with word embeddings (Liang et al., 2018; Taghipour & Ng, 2016), propelled by Artificial Neural Networks (ANN) (Loraksa & Peachavanish, 2007; Taghipour & Ng, 2016; Dong & Zhang, 2016; Liang et al., 2018).…”
Section: Machine Learning Framework
confidence: 99%
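The survey above credits word embeddings with the ability to represent text context. A minimal sketch of that property, using a hypothetical toy embedding table rather than trained vectors (real systems such as Taghipour & Ng, 2016 learn or load pretrained embeddings):

```python
import numpy as np

# Hypothetical toy embedding table; real AES systems use trained vectors.
emb = {
    "essay":   np.array([0.9, 0.1, 0.0]),
    "writing": np.array([0.8, 0.2, 0.1]),
    "score":   np.array([0.1, 0.9, 0.0]),
    "grade":   np.array([0.2, 0.8, 0.1]),
    "banana":  np.array([0.0, 0.1, 0.9]),
}

def embed(tokens):
    """Represent a text as the average of its word vectors."""
    return np.mean([emb[t] for t in tokens], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Texts about the same topic end up closer in embedding space.
sim_related = cosine(embed(["essay", "score"]), embed(["writing", "grade"]))
sim_unrelated = cosine(embed(["essay", "score"]), embed(["banana"]))
print(sim_related > sim_unrelated)  # True
```

This geometric closeness is what lets downstream models (SVMs or neural networks) generalize across essays that use different but related vocabulary.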