This repository contains the PyTorch implementation for the paper Modeling Irregular Time Series with Continuous Recurrent Units by Mona Schirmer, Mazin Eltayeb, Stefan Lessmann, and Maja Rudolph, published at ICML 2022.
Schirmer, M., Eltayeb, M., Lessmann, S., & Rudolph, M. (2022, June). Modeling irregular time series with continuous recurrent units. In International Conference on Machine Learning (pp. 19388-19405). PMLR.
The code uses Python 3 and PyTorch as the auto-differentiation package. To set up an environment with the required packages, run
conda create -n cru python=3.9.7
conda activate cru
pip install -r requirements.txt
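After installation, a quick sanity check (a minimal sketch, not part of the repository) confirms that PyTorch imports correctly and reports whether a GPU is visible:

```python
# Minimal environment check (illustrative, not shipped with the repository).
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

# Tiny tensor operation to verify the installation works end to end.
x = torch.randn(4, 4)
print("Matmul OK:", (x @ x.T).shape)
```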
By default, datasets are generated or downloaded and processed the first time an experiment is run.
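For intuition, irregularly sampled data of the kind used in these experiments is commonly represented as observation times, feature values, and a binary observation mask. The sketch below is purely illustrative and does not reflect the repository's actual data loaders; all names and shapes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 5 irregular observation times for a 3-dimensional series.
times = np.sort(rng.uniform(0.0, 1.0, size=5))   # irregular timestamps
values = rng.normal(size=(5, 3))                  # feature values
mask = rng.uniform(size=(5, 3)) < 0.5             # True where a feature was actually observed

values[~mask] = 0.0                               # unobserved entries are typically zero-filled
print(times)
print(values)
print(mask.astype(int))
```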
- Pendulum interpolation
python run_experiment.py --dataset pendulum --task interpolation -lsd 30 --sample-rate 0.5 --impute-rate 0.5
- Pendulum regression
python run_experiment.py --dataset pendulum --task regression -lsd 30 --sample-rate 0.5
- USHCN interpolation
python run_experiment.py --dataset ushcn --task interpolation -lsd 10 --ts 0.3 --sample-rate 0.5 --unobserved-rate 0.2 --enc-var-activation square --dec-var-activation exp --trans-var-activation relu --grad-clip
- USHCN extrapolation
python run_experiment.py --dataset ushcn --task extrapolation -lsd 10 --ts 0.3 --sample-rate 0.5 --unobserved-rate 0.2 --enc-var-activation square --dec-var-activation exp --trans-var-activation relu --grad-clip
- Physionet interpolation
python run_experiment.py --dataset physionet --task interpolation -lsd 20 --ts 0.2 --enc-var-activation square --dec-var-activation exp --trans-var-activation relu --grad-clip --num-basis 20 --bandwidth 10
- Physionet extrapolation
python run_experiment.py --dataset physionet --task extrapolation -lsd 20 --ts 0.2 --enc-var-activation square --dec-var-activation exp --trans-var-activation relu --grad-clip --num-basis 20 --bandwidth 10 --cut-time 24
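The commands above can also be launched programmatically, for example to sweep a flag across runs. Below is a minimal sketch using Python's subprocess module; the swept values are arbitrary examples, and only flags that already appear in the commands above are used:

```python
import subprocess

# Illustrative sweep over the pendulum interpolation sample rate.
for rate in [0.3, 0.5, 0.7]:
    cmd = [
        "python", "run_experiment.py",
        "--dataset", "pendulum",
        "--task", "interpolation",
        "-lsd", "30",
        "--sample-rate", str(rate),
        "--impute-rate", "0.5",
    ]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)
```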
- CRU
python run_experiment.py --dataset pendulum --task regression -lsd 30 --sample-rate 0.5
- f-CRU
python run_experiment.py --dataset pendulum --task regression -lsd 30 --sample-rate 0.5 --f-cru --lr 5e-3
- RKN
python run_experiment.py --dataset pendulum --task regression -lsd 30 --sample-rate 0.5 --rkn
- RKN-Delta-t
python run_experiment.py --dataset pendulum --task regression -lsd 30 --sample-rate 0.5 --rkn --t-sensitive-trans-net
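To compare these model variants on the same pendulum regression setup, one option is to loop over the variant-specific flags listed above. This is a minimal sketch, not a script shipped with the repository:

```python
import subprocess

# Variant-specific flags taken from the commands listed above.
variants = {
    "cru": [],
    "f-cru": ["--f-cru", "--lr", "5e-3"],
    "rkn": ["--rkn"],
    "rkn-delta-t": ["--rkn", "--t-sensitive-trans-net"],
}

base = ["python", "run_experiment.py", "--dataset", "pendulum",
        "--task", "regression", "-lsd", "30", "--sample-rate", "0.5"]

for name, extra in variants.items():
    print(f"Running variant: {name}")
    subprocess.run(base + extra, check=True)
```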
The CRU builds on the RKN repository by Becker et al. (2019).
Preprocessing of USHCN has been adapted from De Brouwer et al. (2019).
Preprocessing of Physionet has been adapted from Rubanova et al. (2019).
Modeling Irregular Time Series with Continuous Recurrent Units (CRUs) is open-sourced under the AGPL-3.0 license. See the LICENSE file for details.
For a list of other open source components included in Modeling Irregular Time Series with Continuous Recurrent Units (CRUs), see the file 3rd-party-licenses.txt.