A web-based demonstration system of short answer grading based on interactive model building with active learning
Copyright (C) 2021 - Andrew Kwok-Fai Lui, Vanessa Sin-Chun Ng, Stella Cheung Wing-Nga
The Open University of Hong Kong
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License version 2 as published by the Free Software Foundation.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program; if not, see http://www.gnu.org/licenses/.
The Short Answer Grading with Active Learning (SAGAL) algorithm aims to optimize the selection of grading examples for training short answer grading models. The underlying active learning approach works interactively, cycling between selecting data for annotation and updating the model (a generic sketch of this cycle is given below). PerceptiveGrader is a web-based demonstration system for short answer grading based on SAGAL. Please refer to the following paper for details.
Lui, A.K.F., Ng, S.C. and Cheung, S.W.N., 2022. An Interactive Short Answer Grading System based on Active Learning Models, presented at the 4th International Conference on Computer Science and Technologies in Education, 6-8 May 2022.
The details of SAGAL can be found in the following paper.
Lui, A.K.F., Ng, S.C. and Cheung, S.W.N. (in press). Automated Short Answer Grading with Computer-Assisted Grading Example Acquisition based on Active Learning. Interactive Learning Environments.
- SAGAL: A prototype implementation of the Short Answer Grading with Active Learning (SAGAL) algorithm and the programs for the experiments described in the paper.
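For orientation, the select-annotate-update cycle described above can be sketched in a few lines of Python. This is a generic illustration of active learning, not the actual SAGAL implementation; all names here (model.informativeness, model.fit, ask_grader) are hypothetical.

    def active_learning_loop(answers, model, budget, ask_grader):
        # Generic active learning cycle (illustrative only, not the SAGAL code):
        # repeatedly select the most informative answer, have a human grade it,
        # and refit the grading model on all grades collected so far.
        labelled = {}                # answer -> human-assigned grade
        unlabelled = set(answers)
        for _ in range(budget):
            if not unlabelled:
                break
            # 1. Select: pick the answer whose grade the model expects to
            #    learn the most from (hypothetical scoring function).
            candidate = max(unlabelled, key=model.informativeness)
            # 2. Annotate: obtain a grade from the human grader.
            labelled[candidate] = ask_grader(candidate)
            unlabelled.remove(candidate)
            # 3. Update: refit the model with the enlarged grading set.
            model.fit(labelled)
        return model, labelled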
This project relies on Python 3.7.3. Please download and install it from https://www.python.org/downloads/release/python-373/
Please download and install Conda: https://docs.conda.io/projects/conda/en/latest/user-guide/install/macos.html
The project requires the following pre-trained models for text encoding. Please download them and put them under data/models/pretrained/{ENCODER_NAME}/ inside the project folder (a loading sketch follows the list).
- {ENCODER_NAME} = google_universal_sentence_encoder: https://tfhub.dev/google/universal-sentence-encoder/4
- {ENCODER_NAME} = glove: https://nlp.stanford.edu/projects/glove/
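For reference, a minimal sketch of loading the Universal Sentence Encoder from this local directory with tensorflow_hub (the exact loading code used by the project may differ):

    import tensorflow_hub as hub

    # Load the encoder from the local project folder rather than the hub URL.
    encoder = hub.load("data/models/pretrained/google_universal_sentence_encoder")
    # The encoder maps a batch of strings to 512-dimensional embeddings.
    embeddings = encoder(["an example short answer"])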
- Create a virtual environment that runs with Python 3.7.3 (example commands below)
- Activate it and run one of the following:
- MacOS
pip install -r requirements_tf2.txt --no-deps
- Windows
pip install -r requirements_windows.txt --no-deps
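For example, the environment can be created and activated with the standard venv module (assuming a Python 3.7.3 interpreter is available as python3.7 on MacOS, or python on Windows):

- MacOS
  python3.7 -m venv venv
  source venv/bin/activate
- Windows
  python -m venv venv
  venv\Scripts\activate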
- MacOS / Windows
  - Open Terminal (MacOS) or CMD (Windows), go to the project root and run:
    python web/executor.py
- MacOS (Apple Silicon / M1)
  - Create a conda environment that runs with Python 3.9.4
  - Install the dependencies according to VE_M1.yml (example commands below)
  - Open Terminal, go to the project root and run:
    python web/executor.py
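For example, the conda environment can be created from the provided file and activated as follows (the environment name sagal is an arbitrary choice here):

  conda env create -f VE_M1.yml -n sagal
  conda activate sagal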
- MacOS
  - Double-click execute_script.sh.command to run
- Windows
  - Download Git from https://git-scm.com/downloads
  - Install Git. Remember to check "Associate .sh file to be run with Bash".
  - Double-click execute_script.sh to run
The API server and the client server will be started, and a Chrome browser window should open automatically.
The input dataset should be a csv file of the following format: [Question ID, Question Text, Answer ID, Answer Text, Is Reference]
Some example csv files are provided under data/datasets/raw/seb2custom.
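For illustration, a few hypothetical rows in this format (the values are made up; Is Reference presumably flags reference answers, and whether a header row is required is not specified here):

  Q1,What does DNS stand for?,A1,Domain Name System,1
  Q1,What does DNS stand for?,A2,It converts domain names into IP addresses,0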
The following shows screenshots of the browser-based PerceptiveGrader in action.

The main client panel of PerceptiveGrader. The project screen displays the key characteristics of the question-and-answer set. The speciousness-ranked data distribution is shown on the right, on which the default values of MASD and ADT are indicated. The bottom shows a 2D projection of the data distribution.

The annotation panel (left), where the top-ranked answers may be assessed and graded. Clicking on an answer row brings up the answer panel (right), which details the neighborhood of the answer.

Acknowledgement: The work described in this project was fully supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (UGC/FDS16/E10/19).