Deep Graph Memory Networks for Forgetting-Robust Knowledge Tracing

18 Aug 2021 · Ghodai Abdelrahman, Qing Wang

Tracing a student's knowledge is vital for tailoring the learning experience. Recent knowledge tracing (KT) methods address this problem by modelling knowledge state dynamics across learning concepts. However, they still face several inherent challenges, including modelling forgetting behaviours and identifying relationships among latent concepts. To address these challenges, we propose a novel knowledge tracing model, the \emph{Deep Graph Memory Network} (DGMN). This model incorporates a forget gating mechanism into an attention memory structure to capture forgetting behaviours dynamically during the knowledge tracing process. In particular, the forget gating mechanism is built upon attention forgetting features over latent concepts, taking their mutual dependencies into account. Further, the model can learn relationships between latent concepts from a dynamic latent concept graph in light of a student's evolving knowledge states. A comprehensive experimental evaluation on four well-established benchmark datasets shows that DGMN consistently outperforms state-of-the-art KT models across all datasets. Our experiments also analyse the effectiveness of modelling forgetting behaviours and learning latent concept graphs.
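The abstract describes a forget gating mechanism applied to an attention memory over latent concepts. Below is a minimal sketch of what such a forget-gated attention memory read could look like, assuming a key/value memory in the style of earlier memory-augmented KT models. The module name `ForgetGatedMemoryRead`, the choice of forgetting features (current memory slot plus elapsed time), and the exact gating formula are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ForgetGatedMemoryRead(nn.Module):
    """Sketch: attention read over a latent-concept memory with a forget gate."""

    def __init__(self, num_concepts: int, key_dim: int, value_dim: int):
        super().__init__()
        # Static key memory over latent concepts; the value memory (passed in
        # at forward time) holds the student's evolving knowledge state.
        self.key_memory = nn.Parameter(torch.randn(num_concepts, key_dim))
        # Forget gate: maps per-concept forgetting features (here, the current
        # value slot concatenated with elapsed time) to a retention factor in (0, 1).
        self.forget_gate = nn.Linear(value_dim + 1, value_dim)

    def forward(self, question_emb, value_memory, elapsed_time):
        # question_emb:  (batch, key_dim)
        # value_memory:  (batch, num_concepts, value_dim)
        # elapsed_time:  (batch, num_concepts, 1), time since each concept was last practised
        # Attention of the current question over latent concepts.
        attn = F.softmax(question_emb @ self.key_memory.t(), dim=-1)   # (batch, num_concepts)

        # Forgetting features per concept, then a sigmoid retention gate.
        forget_feats = torch.cat([value_memory, elapsed_time], dim=-1)
        retain = torch.sigmoid(self.forget_gate(forget_feats))          # (batch, num_concepts, value_dim)
        decayed_memory = retain * value_memory                          # apply forgetting before reading

        # Attention-weighted read of the decayed knowledge state.
        read = (attn.unsqueeze(-1) * decayed_memory).sum(dim=1)         # (batch, value_dim)
        return read, attn
```

In the paper, the forgetting features and concept relationships are further informed by a dynamic latent concept graph; the sketch above omits that graph component and only illustrates the gating idea.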
