Going Deeper with Deep Knowledge Tracing - EDM-2016
Source code and data sets for Going Deeper with Deep Knowledge Tracing
- TensorFlow 0.10 (the only version tested)
- Numpy
- scikit-learn
git clone https://github.com/siyuanzhao/2016-EDM.git
cd 2016-EDM
python student_model.py
There are several flags in student_model.py. Some are hyper-parameters for the model; others are paths to the training and testing data.
Check all available flags with the following command.
python student_model.py -h
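The flag names shown in the commands here can be sketched with Python's argparse (a hypothetical illustration only; the actual script likely defines its flags through TensorFlow's flag mechanism, and the default values below are made up):

```python
import argparse

# Hypothetical argparse sketch of the flags mentioned in this README.
# The real student_model.py may define different defaults.
parser = argparse.ArgumentParser(description='DKT student model (sketch)')
parser.add_argument('--train_data_path', type=str, help='path to training data')
parser.add_argument('--test_data_path', type=str, help='path to testing data')
parser.add_argument('--hidden_layer_num', type=int, default=1,
                    help='number of hidden layers')
parser.add_argument('--hidden_size', type=int, default=200,
                    help='number of hidden nodes per layer')

# Parsing an example command line, as in the usage shown below:
args = parser.parse_args(['--hidden_layer_num', '2', '--hidden_size', '400'])
```

With argparse, `-h` prints the same kind of flag listing that the command above produces.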
Run the model on a different data set
python student_model.py --train_data_path=<path-to-your-data> --test_data_path=<path-to-your-data>
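A custom data set has to match the expected CSV layout. As a hedged sketch, assuming the three-rows-per-student format common in DKT repositories (row 1: number of attempts, row 2: skill ids, row 3: 0/1 correctness) — verify against the files in the data folder before relying on this:

```python
import csv

def load_kt_csv(path):
    # Assumed layout (common in DKT code bases, not confirmed for this repo):
    # each student occupies three consecutive rows:
    #   row 1: number of attempts n
    #   row 2: n skill ids
    #   row 3: n correctness labels (0 or 1)
    students = []
    with open(path) as f:
        rows = [r for r in csv.reader(f) if r]  # skip blank lines
    for i in range(0, len(rows), 3):
        n = int(rows[i][0])
        skills = [int(x) for x in rows[i + 1][:n]]
        correct = [int(x) for x in rows[i + 2][:n]]
        students.append((skills, correct))
    return students
```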
You can also set the number of hidden layers and the number of hidden nodes with flags.
python student_model.py --hidden_layer_num=2 --hidden_size=400
- The model uses the Adam optimizer.
- Gradient noise is added (arXiv).
- Gradient norm clipping is applied (arXiv).
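The two gradient tricks above can be sketched in NumPy (a minimal illustration, not the repository's TensorFlow implementation; the noise standard deviation here is an arbitrary example value):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    # Rescale all gradients so their combined L2 norm is at most max_norm.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (global_norm + 1e-12))
    return [g * scale for g in grads]

def add_gradient_noise(grads, stddev=1e-3, rng=None):
    # Add zero-mean Gaussian noise to each gradient before the update step,
    # following the "adding gradient noise" trick referenced above.
    rng = rng or np.random.default_rng(0)
    return [g + rng.normal(0.0, stddev, size=g.shape) for g in grads]
```

Clipping bounds the size of any single update; the noise acts as a regularizer during training.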
The data sets used in the paper are in the data folder.
| | ASSISTments 09-10 (a) | ASSISTments 09-10 (b) | ASSISTments 09-10 (c) |
|---|---|---|---|
| File name | 0910_a | 0910_b | 0910_c |
| Has duplicated records | No | No | No |
| Has subproblems | Yes | No | No |
| Repeats sequences for multiple skills | Yes | Yes | No |
| Combines skills for multiple skills | No | No | Yes |
CAT_train.csv and CAT_test.csv are data files from Cognitive Tutor.
Since I made some changes to the code, I will rerun the model and record the updated results.