2020
DOI: 10.1007/978-3-030-55130-8_9
Attention-Based Knowledge Tracing with Heterogeneous Information Network Embedding

Abstract: Knowledge tracing is a key area of research contributing to personalized education. In recent times, deep knowledge tracing has achieved great success. However, the sparsity of students' practice data still limits the performance and application of knowledge tracing. An additional complication is that the contribution of each answer record to the current knowledge state differs at each time step. To solve these problems, we propose Attention-based Knowledge Tracing with Heterogeneous Information Network Embedding…
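For orientation, here is a minimal sketch of the general recipe the abstract describes: pretrained exercise embeddings (for example, obtained from a heterogeneous information network) feed an LSTM over a student's answer records, and an attention step weights each past record's contribution when predicting the response to the next exercise. It is written in PyTorch purely as an illustration; the class name AttentionKT, all dimensions, and the concrete attention form are assumptions, not the authors' architecture.

# Hedged sketch of attention-based knowledge tracing with pretrained exercise embeddings.
# Not the paper's code; every name and hyperparameter below is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionKT(nn.Module):
    def __init__(self, num_exercises, embed_dim=64, hidden_dim=128,
                 pretrained_exercise_emb=None):
        super().__init__()
        # One embedding per (exercise, correctness) pair: index = exercise * 2 + correct
        self.interaction_emb = nn.Embedding(num_exercises * 2, embed_dim)
        # Exercise embeddings; could be initialised from HIN-style pretraining
        self.exercise_emb = nn.Embedding(num_exercises, embed_dim)
        if pretrained_exercise_emb is not None:
            self.exercise_emb.weight.data.copy_(pretrained_exercise_emb)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn_proj = nn.Linear(hidden_dim + embed_dim, 1)
        self.out = nn.Linear(hidden_dim + embed_dim, 1)

    def forward(self, exercise_ids, correct, next_exercise_ids):
        # exercise_ids, correct, next_exercise_ids: (batch, seq_len) long tensors
        x = self.interaction_emb(exercise_ids * 2 + correct)          # (B, T, E)
        h, _ = self.lstm(x)                                           # (B, T, H)
        q = self.exercise_emb(next_exercise_ids)                      # (B, T, E)

        B, T, H = h.shape
        # Score every past hidden state against the queried next exercise
        h_exp = h.unsqueeze(1).expand(B, T, T, H)                     # (B, T_query, T_key, H)
        q_exp = q.unsqueeze(2).expand(B, T, T, q.size(-1))
        scores = self.attn_proj(torch.cat([h_exp, q_exp], dim=-1)).squeeze(-1)  # (B, T, T)
        # Causal mask: a query at step t may only attend to steps <= t
        mask = torch.tril(torch.ones(T, T, dtype=torch.bool, device=h.device))
        scores = scores.masked_fill(~mask, float("-inf"))
        alpha = F.softmax(scores, dim=-1)                             # per-step attention weights
        context = torch.matmul(alpha, h)                              # (B, T, H)

        logits = self.out(torch.cat([context, q], dim=-1)).squeeze(-1)
        return torch.sigmoid(logits)                                  # P(correct) per step

# Toy usage: in practice next_exercise_ids would be the sequence shifted by one step.
model = AttentionKT(num_exercises=100)
ex = torch.randint(0, 100, (4, 12))
corr = torch.randint(0, 2, (4, 12))
p_correct = model(ex, corr, ex)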

Cited by 9 publications (2 citation statements)
References 8 publications (4 reference statements)
“…Distinguishing itself from previous work, Sun et al [31] employed the XGBoost algorithm [50] to pre-classify edge features, while Liu et al [43] employed principal component analysis [51] to automatically capture feature representations and incorporated an attention mechanism after the LSTM to assign importance weights to different features. Additionally, Zhang et al [52] leveraged heterogeneous networks to emphasize the significance of edge information. They employed heterogeneous networks to describe the representations of exercises and their attributes and added an attention mechanism after the RNN model to determine the importance weights of different exercises.…”
Section: Multi-feature Analysis in Knowledge Tracing
confidence: 99%
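As a rough illustration of the heterogeneous-network idea credited to Zhang et al [52] above, the sketch below builds a toy heterogeneous graph of exercises, knowledge concepts, and difficulty attributes and learns exercise embeddings with metapath-guided random walks fed to skip-gram (a metapath2vec-style recipe). The graph, the metapath, the node names, and every hyperparameter are assumptions for illustration only, not the cited implementation.

# Hedged sketch: HIN-style exercise embeddings from metapath-guided random walks.
import random
import networkx as nx
from gensim.models import Word2Vec

# Toy heterogeneous graph: exercise ('e') -- knowledge concept ('k') -- difficulty ('d')
G = nx.Graph()
edges = [("e1", "k1"), ("e2", "k1"), ("e3", "k2"),
         ("e1", "d_easy"), ("e2", "d_hard"), ("e3", "d_easy")]
node_type = {"e1": "e", "e2": "e", "e3": "e",
             "k1": "k", "k2": "k", "d_easy": "d", "d_hard": "d"}
G.add_edges_from(edges)

def metapath_walk(graph, start, metapath=("e", "k", "e", "d", "e"), walk_len=20):
    # Random walk whose i-th step must visit a node of the type prescribed by the
    # cyclically repeated metapath (first and last metapath types coincide).
    walk, cur = [start], start
    for i in range(1, walk_len):
        wanted = metapath[i % (len(metapath) - 1)]
        nbrs = [n for n in graph.neighbors(cur) if node_type[n] == wanted]
        if not nbrs:
            break
        cur = random.choice(nbrs)
        walk.append(cur)
    return walk

# Start several walks from every exercise node and train skip-gram on the walk corpus.
walks = [metapath_walk(G, n) for n in G.nodes if node_type[n] == "e" for _ in range(10)]
model = Word2Vec(walks, vector_size=16, window=3, min_count=1, sg=1, epochs=50)
exercise_vec = model.wv["e1"]   # could seed the exercise embedding layer of a KT model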
“…Soon after, other solutions incorporated additional characteristics, for instance attention mechanisms (Pandey & Karypis, 2019), trying to generalize the complex representation of acquired knowledge from sparse data, which is the case with real-world data; their approach identifies relevant KCs in order to give them more importance. Subsequently, other authors (Pandey & Srivastava, 2020) (N. Zhang et al, 2020) (Zhu et al, 2020) (Ghosh et al, 2020) proposed additional attention-based solutions.…”
Section: Introduction
confidence: 99%