Conference paper, 2022

Augmenting Convolution Neural Networks by Utilizing Attention Mechanism for Knowledge Tracing

Abstract

The devastating, ongoing COVID-19 pandemic has led many students to turn to online education, which faces severe challenges in guaranteeing the quality of instruction. An important component of online education is Knowledge Tracing (KT), whose objective is to estimate students' learning performance from the sequence of questions they answer. KT has garnered widespread attention ever since it was proposed. Recently, an increasing number of research efforts have concentrated on deep learning (DL)-based KT, owing to its considerable success over traditional Bayesian KT methods. Most existing DL-based KT methods rely on Recurrent Neural Networks and their variants, e.g. Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). Recurrent neural networks are good at modeling local features but underperform on long sequences, so an attention mechanism is introduced to compensate for this shortcoming. In this paper, we introduce a DL-based KT model, referred to as Convolutional Attention Knowledge Tracing (CAKT), which uses an attention mechanism to augment a Convolutional Neural Network (CNN) and thereby enhance its ability to model longer-range dependencies.
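To make the idea concrete, below is a minimal, hypothetical sketch of the general approach the abstract describes (not the authors' CAKT implementation): a 1D convolution models local patterns in a student's interaction sequence, and a causally masked self-attention layer augments it to capture longer-range dependencies. The class name ConvAttentionKT, the layer sizes, and the (skill, correctness) interaction encoding are illustrative assumptions.

import torch
import torch.nn as nn


class ConvAttentionKT(nn.Module):
    """Toy CNN-plus-attention knowledge-tracing model (illustrative only)."""

    def __init__(self, num_skills, embed_dim=64, kernel_size=3, num_heads=4):
        super().__init__()
        # Each interaction is encoded as a (skill, correctness) pair,
        # giving 2 * num_skills possible interaction ids.
        self.embed = nn.Embedding(2 * num_skills, embed_dim)
        # 1D convolution captures local patterns in the interaction sequence.
        # (A real KT model would use a causal convolution; symmetric padding
        # is kept here for brevity.)
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)
        # Self-attention compensates for the CNN's limited receptive field
        # by letting every step attend to all earlier steps.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        # Predict the probability of answering each skill correctly next.
        self.out = nn.Linear(embed_dim, num_skills)

    def forward(self, interactions):
        # interactions: (batch, seq_len) integer ids in [0, 2 * num_skills)
        x = self.embed(interactions)                        # (B, T, D)
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)    # local features
        T = x.size(1)
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        attn_out, _ = self.attn(x, x, x, attn_mask=causal)  # long-range context
        x = self.norm(x + attn_out)                         # residual + norm
        return torch.sigmoid(self.out(x))                   # (B, T, num_skills)


# Toy usage: 50 skills, a batch of 2 sequences of length 20.
model = ConvAttentionKT(num_skills=50)
seq = torch.randint(0, 100, (2, 20))
print(model(seq).shape)  # torch.Size([2, 20, 50])

The residual connection and layer normalization around the attention block follow common Transformer practice; the paper itself should be consulted for the actual CAKT architecture and hyperparameters.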
File under embargo until Wednesday, January 1, 2025.

Dates and versions

hal-04178712, version 1 (08-08-2023)

Licence

Identifiers

Cite

Meng Zhang, Liang Chang, Tieyuan Liu, Chen Wei. Augmenting Convolution Neural Networks by Utilizing Attention Mechanism for Knowledge Tracing. 12th International Conference on Intelligent Information Processing (IIP), May 2022, Qingdao, China. pp.80-86, ⟨10.1007/978-3-031-03948-5_7⟩. ⟨hal-04178712⟩