Paper title: A Self-Attentive model for Knowledge Tracing. This paper proposes a self-attention-based knowledge tracing model, Self Attentive Knowledge Tracing (SAKT). In essence, it applies the encoder part of the Transformer to a sequence task. Task description: given a student's interaction sequence X = (x_1, x_2, …, x_t), where x_t = (e_t, r_t), predict the student's performance on the next exercise, i.e., predict p(r_{t+1} = 1...
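Before the sequence reaches the attention layers, each interaction tuple (e_t, r_t) is typically flattened into a single token id. A minimal sketch in Python, using the common convention y_t = e_t + r_t · E (the function and variable names here are illustrative, not from the paper):

```python
def encode_interactions(exercises, responses, num_exercises):
    """Flatten each (exercise id, response) pair into one token id.

    With E exercises and binary responses, y_t = e_t + r_t * E yields
    2 * E distinct interaction tokens, so a correct and an incorrect
    attempt at the same exercise get different embeddings.
    """
    return [e + r * num_exercises for e, r in zip(exercises, responses)]

# Example: three attempts, responses 1 = correct, 0 = incorrect
tokens = encode_interactions([0, 2, 1], [1, 0, 1], num_exercises=3)
# tokens == [3, 2, 4]
```

The resulting token ids index an interaction-embedding table, exactly as word ids do in a language model.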
This paper proposes SAKT (Self Attentive Knowledge Tracing), a knowledge tracing model based on self-attention. The model aims to address the following problems: the RNN-based deep knowledge tracing models DKT and DKVMN generalize poorly on sparse data; the parameters of DKT are not interpretable; and RNN-based models are computationally inefficient. To address these issues, the paper proposes the following: replace the RNN with a pure attention mechanism,...
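The core operation that replaces the RNN is scaled dot-product attention with a causal mask, so that the prediction at step t can only look at interactions up to step t. A minimal single-head sketch in pure Python (the real model adds embeddings, multiple heads, residual connections, and a feed-forward network):

```python
import math

def causal_self_attention(queries, keys, values):
    """Scaled dot-product attention where position t attends only to
    positions 0..t (the causal mask that lets the model be trained on
    whole sequences without leaking future responses)."""
    d = len(queries[0])  # key/query dimension
    out = []
    for t, q in enumerate(queries):
        # similarity scores over the visible prefix 0..t
        scores = [sum(qi * ki for qi, ki in zip(q, keys[s])) / math.sqrt(d)
                  for s in range(t + 1)]
        # numerically stable softmax over the prefix
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        weights = [w / z for w in weights]
        # weighted sum of the visible value vectors
        out.append([sum(w * values[s][j] for s, w in enumerate(weights))
                    for j in range(len(values[0]))])
    return out
```

Because each output is a weighted sum over the prefix, every position is computed independently, which is what makes attention-based models more parallelizable than an RNN.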
In this paper, we propose a novel Sequential Self-Attentive model for Knowledge Tracing (SSAKT). SSAKT utilizes question information based on Multidimensional Item Response Theory (MIRT), which can capture the relations between questions and skills. SSAKT then uses a self-attention layer to capture...
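For context, MIRT extends item response theory by making student ability a vector rather than a scalar, so a question can load on several skills at once. A hedged sketch of the standard multidimensional 2PL response probability (symbols a, b, θ follow textbook MIRT usage; the paper's exact parameterization may differ):

```python
import math

def mirt_correct_probability(ability, discrimination, difficulty):
    """Multidimensional 2PL model: p = sigmoid(a . theta - b), where
    theta is the ability vector, a the per-skill discrimination vector,
    and b a scalar difficulty."""
    logit = sum(a * th for a, th in zip(discrimination, ability)) - difficulty
    return 1.0 / (1.0 + math.exp(-logit))
```

A question's discrimination vector here plays the role of its link to skills, which is the relation SSAKT feeds into its attention layers.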
Although structures such as positional encoding and forgetting gates have already been used in knowledge tracing models, positional information, despite its great potential, is not fully exploited. In this paper, we propose a Position-aware Self-Attentive Knowledge Tracing (PAKT) model with a position supervision...
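The positional encoding these models build on is usually the standard Transformer sinusoidal scheme, which assigns each position a fixed vector of sines and cosines at geometrically spaced frequencies. A minimal sketch (one common formulation; PAKT's position-supervision mechanism goes beyond this):

```python
import math

def sinusoidal_positions(seq_len, dim):
    """Transformer-style positional encoding: even dimensions use sin,
    odd dimensions use cos, with wavelengths from 2*pi up to 10000*2*pi."""
    pe = []
    for pos in range(seq_len):
        row = []
        for i in range(dim):
            # paired dimensions (2k, 2k+1) share one frequency
            angle = pos / (10000 ** ((i // 2 * 2) / dim))
            row.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
        pe.append(row)
    return pe
```

These vectors are simply added to the interaction embeddings, giving the otherwise order-agnostic attention layers a notion of where each interaction sits in the sequence.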