This is the official PyTorch implementation of the KDD 2022 paper TARNet: Task-Aware Reconstruction for Time-Series Transformer.
The classification datasets can be found at UEA Archive, UCI Machine Learning Repository, and Towards automatic spatial verification of sensor placement in buildings.
The regression datasets are available at the Monash, UEA & UCR Time Series Regression Archive.
The data directory contains an example of a preprocessed classification dataset: Atrial Fibrillation (AF) and a preprocessed regression dataset: Appliances Energy (AE), along with their corresponding preprocessing files.
The `preprocessing.py` file under `/data/AF/` can be used to preprocess any classification dataset from the UEA Archive by changing the `filename` parameter.
The `preprocessing.py` file under `/data/AE/` can be used to preprocess any regression dataset from the Monash, UEA & UCR Time Series Regression Archive by changing the `train_file` and `test_file` parameters.
After running the `preprocessing.py` files on the raw datasets downloaded from the repositories, store the `X_train.npy`, `y_train.npy`, `X_test.npy`, `y_test.npy` files for each dataset under the `/data/` directory.
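The expected on-disk layout can be sanity-checked before training. The sketch below writes and reloads dummy arrays in a temporary directory; the shapes (15 series of 640 timesteps with 2 channels, roughly AF-like) are illustrative assumptions, not the actual dataset dimensions.

```python
import os
import tempfile

import numpy as np

# Illustrative shapes only: 15 series, 640 timesteps, 2 channels.
data_dir = tempfile.mkdtemp()
X_train = np.random.randn(15, 640, 2).astype(np.float32)
y_train = np.random.randint(0, 3, size=15)
np.save(os.path.join(data_dir, "X_train.npy"), X_train)
np.save(os.path.join(data_dir, "y_train.npy"), y_train)

# Reload and verify the split is consistent: one label per series.
X = np.load(os.path.join(data_dir, "X_train.npy"))
y = np.load(os.path.join(data_dir, "y_train.npy"))
assert X.shape[0] == y.shape[0], "one label per series"
print(X.shape, y.shape)
```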
```
git clone https://github.com/ranakroychowdhury/TARNet.git
```
Run the `script.py` file to train and evaluate the model, for example:

```
python3 script.py --dataset AF --task_type classification
```

This will train the model on the Atrial Fibrillation (AF) dataset and report the accuracy. Similarly, to run a regression dataset, use `--task_type regression`.
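The command-line interface above can be sketched with `argparse`. This is a minimal illustration of how the two flags might be parsed, not the actual argument handling in `script.py`, which may accept many more options.

```python
import argparse

# Hypothetical parser mirroring the two flags shown in the usage example.
parser = argparse.ArgumentParser(description="Train and evaluate TARNet")
parser.add_argument("--dataset", type=str, default="AF")
parser.add_argument(
    "--task_type",
    type=str,
    choices=["classification", "regression"],
    default="classification",
)

# Parse the example invocation from the README.
args = parser.parse_args(["--dataset", "AF", "--task_type", "classification"])
print(args.dataset, args.task_type)
```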
You can specify any hyperparameter on the command line and train the model using those hyperparameters. In that case, uncomment line 52 in the `utils.py` file. Alternatively, you may use the optimized set of hyperparameters for each dataset, stored in the `hyperparameters.pkl` file. In that case, uncomment line 49 in the `utils.py` file.
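A pickle file like `hyperparameters.pkl` can be inspected with the standard `pickle` module. The sketch below assumes the file maps dataset names to hyperparameter dicts; the repository's actual file may be structured differently, and the names and values here are made up for illustration.

```python
import pickle
import tempfile

# Hypothetical layout: dataset name -> hyperparameter dict.
# The real hyperparameters.pkl may differ in both keys and values.
hyperparameters = {"AF": {"lr": 0.001, "batch": 16, "nlayers": 4}}

path = tempfile.mktemp(suffix=".pkl")
with open(path, "wb") as f:
    pickle.dump(hyperparameters, f)

# Load the file back and look up one dataset's settings.
with open(path, "rb") as f:
    loaded = pickle.load(f)
print(loaded["AF"])
```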
- `data/`: contains all preprocessed datasets with `X_train.npy`, `y_train.npy`, `X_test.npy`, `y_test.npy` files
- `preprocessing.py`: transforms the raw datasets downloaded from the data repositories into NumPy files
- `hyperparameters.pkl`: stores the best set of hyperparameters for each dataset
- `multitask_transformer_class.py`: TARNet model file
- `script.py`: the main file that loads the dataset and initializes, trains, and evaluates the model
- `transformer.py`: transformer encoder and multi-head attention
- `utils.py`: contains all the helper functions
If you find the code useful, please cite our paper:
```
@inproceedings{chowdhury2022tarnet,
  title={TARNet: Task-Aware Reconstruction for Time-Series Transformer},
  author={Chowdhury, Ranak Roy and Zhang, Xiyuan and Shang, Jingbo and Gupta, Rajesh K and Hong, Dezhi},
  booktitle={Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
  pages={212--220},
  year={2022}
}
```