Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 2020
DOI: 10.1145/3340531.3411994

RKT: Relation-Aware Self-Attention for Knowledge Tracing

Abstract: The world has transitioned into a new phase of online learning in response to the recent COVID-19 pandemic. Now more than ever, it has become paramount to push the limits of online learning in every manner to keep the education system flourishing. One crucial component of online learning is Knowledge Tracing (KT). The aim of KT is to model a student's knowledge level based on their answers to a sequence of exercises, referred to as interactions. Students acquire their skills while solving exercises, and each such inte…


Cited by 106 publications (58 citation statements)
References 28 publications
“…Along this line, the high-order graph structure information and knowledge concept information can be fully explored for the final student performance prediction. Educational psychology models are mainly discussed from two sides: cognitive diagnosis models and knowledge tracing models [17,20]. Cognitive diagnosis models, assuming students' knowledge states are static throughout their practice, aim to discover students' proficiency to predict their future performance [18,5,28,35,10,8].…”
Section: Introduction
confidence: 99%
“…38 To focus on the relevant interactions, the Transformer framework applies a self-attention mechanism to the input data, and hence incorporates the inner relations of the exercise sequences into the network. Attention-based KT models inspired by this study have become an active research area; 15,16,41 a representative work is the attentive knowledge tracing (AKT) model, 16 which obtained state-of-the-art performance on this task. However, in most of these models, KT is based on the skills in a specific domain.…”
Section: Deep Learning Models
confidence: 99%
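The self-attention step this excerpt refers to can be made concrete with a small sketch. Below is a minimal, illustrative implementation of causal scaled dot-product attention over a sequence of interaction embeddings, in the spirit of SAKT/AKT; the identity Q/K/V projections and all shapes are simplifying assumptions, not the authors' code.

```python
# Minimal sketch: causal scaled dot-product self-attention over past
# interaction embeddings. Real models use learned Q/K/V projections,
# multiple heads, and trained embeddings; this keeps only the core idea.
import numpy as np

def self_attention(X):
    """X: (seq_len, d) array, one row per past interaction embedding."""
    d = X.shape[1]
    Q = K = V = X                                  # identity projections for brevity
    scores = Q @ K.T / np.sqrt(d)                  # pairwise relevance of interactions
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf                         # causal: attend only to earlier steps
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V                             # context vector per time step
```

Each row of the returned matrix is a weighted summary of the earlier interactions most relevant to that step, which is exactly the "inner relations in the exercise sequences" the excerpt describes.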
“…This approach loses the distinctive information related to individual questions, leading to imprecise inferences of the learners' knowledge states. 12,[14][15][16] For example, in Figure 1 (left), the questions "3+5" and "345+6789" both require the skill "addition of integers," and are treated as the same inputs when building the KT models, which ignores their different difficulty levels. Existing research has shown that question difficulty undoubtedly influences learner performance, 11,17,18 and the relative difficulty level of a specific question varies from learner to learner.…”
Section: Introduction
confidence: 99%
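One known remedy, used by the AKT model mentioned in the previous excerpt, is a Rasch-style question embedding that preserves question-level difficulty while still sharing skill information. A minimal sketch under that assumption; all names, dimensions, and difficulty values here are illustrative placeholders, not values from either paper.

```python
# Hedged sketch of a Rasch-style question embedding: a question q tagged
# with skill c is encoded as x_q = c_vec + mu_q * d_vec, where mu_q is a
# learned scalar difficulty. In a trained model these parameters are
# learned; the numbers below are placeholders only.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
skill_vec = {"addition_of_integers": rng.normal(size=dim)}  # c_vec
skill_var = {"addition_of_integers": rng.normal(size=dim)}  # d_vec (variation direction)
difficulty = {"3+5": -1.2, "345+6789": 0.9}                 # mu_q per question

def question_embedding(q, skill):
    # Same skill, different difficulty -> distinct inputs to the model.
    return skill_vec[skill] + difficulty[q] * skill_var[skill]
```

Under this scheme, "3+5" and "345+6789" share the same skill vector but enter the model as different embeddings, so their difficulty gap is no longer lost.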
“…They use the embedding of skill-response-hint instead of skill-response as the input to capture the learner's knowledge growth after a response. Pandey and Srivastava propose the Relation-aware self-attention model for Knowledge Tracing (RKT) [12], built on SAKT. They take into account the relations between skills, as well as the time elapsed since the last interaction, to inform the self-attention mechanism.…”
Section: Related Work
confidence: 99%
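The mechanism this excerpt describes can be sketched as a modification of the attention weights from the earlier example: relation coefficients between past interactions are scaled by an exponential forgetting term in the elapsed time and blended with the content-based attention weights. This is a hedged illustration of the idea, not the authors' implementation; tau, lam, and all shapes are assumed placeholders, and in the actual model such quantities would be tuned or learned.

```python
# Illustrative sketch of relation-aware attention in the spirit of RKT:
# blend content-based attention with relation coefficients that decay
# exponentially with elapsed time since each past interaction.
import numpy as np

def relation_aware_weights(attn, rel, dt, tau=1.0, lam=0.5):
    """attn: (T, T) row-normalized self-attention weights.
    rel:  (T, T) relation coefficients between interaction pairs.
    dt:   (T, T) elapsed time between interaction pairs (same units as tau).
    """
    decay = np.exp(-dt / tau)                      # forgetting: older -> weaker
    R = rel * decay
    R = R / (R.sum(axis=1, keepdims=True) + 1e-9)  # normalize per query step
    return lam * attn + (1.0 - lam) * R            # convex blend of the two cues
```

The blended weights then multiply the value vectors exactly as in the plain self-attention sketch above, so relevant-but-recent interactions dominate the predicted knowledge state.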