This is the official implementation of the ICML 2024 paper "Irregular Multivariate Time Series Forecasting: A Transformable Patching Graph Neural Networks Approach".
As illustrated in the figure, given a set of historical Irregular Multivariate Time Series (IMTS) observations and a set of forecasting queries, the IMTS forecasting problem aims to accurately predict the values corresponding to these queries.
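For concreteness, an IMTS sample can be thought of as per-variable lists of (timestamp, value) observations together with per-variable query timestamps. The representation below is an assumption for illustration only, not necessarily the exact data format used in this repository:

# toy IMTS sample with two variables observed at irregular, unaligned times
observations = {
    "heart_rate": [(0.0, 72.0), (1.3, 75.0), (4.8, 71.0)],      # (time, value)
    "blood_pressure": [(0.5, 118.0), (3.9, 121.0)],
}
# forecasting queries: for each variable, the future timestamps to predict at
queries = {
    "heart_rate": [6.0, 7.5],
    "blood_pressure": [6.0],
}
# the task: predict each variable's value at each of its queried timestamps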
We build a new comprehensive benchmark for Irregular Multivariate Time Series Forecasting, consisting of four scientific datasets covering healthcare, biomechanics, and climate science.
For PhysioNet, Human Activity, and USHCN, we provide the processed datasets to ease reproduction.
For MIMIC, because of the PhysioNet Credentialed Health Data License, you need to first request the raw database from here (we use version v1.4). After downloading the raw data, follow the preprocessing of Neural Flows to obtain full_dataset.csv, which serves as the raw data in our experiments.
The main results:
The overview of t-PatchGNN: it first divides each univariate irregular time series into a series of transformable patches, each covering a unified time horizon but containing a varying number of consecutive observations. The patching outcomes can then be seamlessly modeled by a Transformer and time-adaptive GNNs, which incorporate time-varying adaptive graph structure learning (GSL), to achieve effective intra- and inter-time series modeling for IMTS.
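As a rough illustration of the transformable patching step (a minimal sketch, not the authors' implementation; the function name, boundary handling, and toy data below are assumptions), the idea is to cut the observation window into patches of equal time span so that each patch holds a varying number of observations:

import numpy as np

def transformable_patching(timestamps, values, history_len, patch_len):
    """Split one irregular univariate series into patches that cover equal
    time spans (patch_len) but contain varying numbers of observations."""
    num_patches = int(np.ceil(history_len / patch_len))
    patches = []
    for p in range(num_patches):
        start, end = p * patch_len, (p + 1) * patch_len
        # keep observations whose timestamps fall inside this patch's horizon
        mask = (timestamps >= start) & (timestamps < end)
        patches.append((timestamps[mask], values[mask]))
    return patches

# toy example: 6 irregular observations over a history of 8 time units,
# patch length 2 -> 4 patches holding 2, 1, 0, and 3 observations
ts = np.array([0.3, 1.7, 2.1, 6.0, 6.4, 7.9])
vs = np.array([0.5, 0.8, 0.2, 1.1, 0.9, 1.3])
for i, (pt, pv) in enumerate(transformable_patching(ts, vs, 8.0, 2.0)):
    print(f"patch {i}: times={pt}, values={pv}")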
t-PatchGNN has been tested with Python 3.9 and CUDA 12.2.
To install the required dependencies with consistent library versions, run the following command:
pip install -r requirements.txt
For the specific configurations used to obtain the main experimental results reported in the paper, please refer to the script run_all.sh. To reproduce these results, run the following command:
sh ./tPatchGNN/scripts/run_all.sh
Example:
python run_models.py \
--dataset {dataset} --state {def_or_debug} --history {length_of_observation} \
--patience {patience_of_earlystopping} --batch_size {batch_size} --lr {learning_rate} \
--patch_size {window_size_for_a_patch} \
--stride {period_stride_for_patch_sliding} \
--nhead {heads_in_Transformer} \
--tf_layer {number_of_layer_in_Transformer} \
--nlayer {num_of_layer_in_Time_Series_Model} \
--te_dim {number_of_units_for_time_encoding} \
--node_dim {number_of_units_for_node_vectors} \
--hid_dim {number_of_units_per_hidden_layer} \
--outlayer {model_for_outlayer} --seed {seed} --gpu {gpu}
dataset: the dataset name, selected from [physionet, mimic, activity, ushcn].
seed: the seed for parameter initialization.
history: the length of the observation window; the remaining time is used for forecasting. Please note that different datasets have varying time spans and levels of granularity.
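For instance, a single run on the physionet dataset might look like the command below. The hyperparameter values here (and the outlayer choice) are illustrative assumptions, not the tuned configurations from the paper; see run_all.sh for those.

python run_models.py \
    --dataset physionet --state def --history 24 \
    --patience 10 --batch_size 32 --lr 1e-3 \
    --patch_size 8 --stride 8 \
    --nhead 1 --tf_layer 1 --nlayer 1 \
    --te_dim 10 --node_dim 10 --hid_dim 64 \
    --outlayer Linear --seed 1 --gpu 0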
@inproceedings{zhangirregular2024,
title={Irregular Multivariate Time Series Forecasting: A Transformable Patching Graph Neural Networks Approach},
author={Zhang, Weijia and Yin, Chenlong and Liu, Hao and Zhou, Xiaofang and Xiong, Hui},
booktitle={Forty-first International Conference on Machine Learning},
year={2024}
}