Add and Thin: Diffusion for Temporal Point Processes

This is the reference implementation of our NeurIPS 2023 paper Add and Thin: Diffusion for Temporal Point Processes.

Citation

If you build upon this work, please cite our paper as follows:

@inproceedings{luedke2023add,
    title={Add and Thin: Diffusion for Temporal Point Processes},
    author={David L{\"u}dke and Marin Bilo{\v{s}} and Oleksandr Shchur and Marten Lienen and Stephan G{\"u}nnemann},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
    url={https://openreview.net/forum?id=tn9Dldam9L}
}

Getting started

Basic requirements

The code has been tested on a cluster of Linux nodes using SLURM.
We cannot guarantee that the code works correctly in other environments.

Installation

To install and run our code, we recommend using a virtual environment (e.g., created via pyenv-virtualenv or conda).

The installation consists of three steps. You can skip step 0 at your own risk.

(Optional) Step 0: create a virtual environment

Below we show how to create the environment with pyenv and pyenv-virtualenv:

  • install pyenv (if you don't have it yet) by following the original guidelines;
  • install the correct Python version:
    pyenv install 3.10.4
  • create a virtual environment with the correct version of Python:
    pyenv virtualenv 3.10.4 add_thin

Step 1: clone the repository, change into it and (optionally) activate the environment

This step downloads the code to your machine, moves into the correct directory and (optionally) activates the environment. The steps are the following:

  • clone the repository:
    git clone https://github.com/davecasp/add-thin.git
  • change into the repository:
    cd add-thin
  • (optionally) activate the environment:
    pyenv activate add_thin

Step 2: install the code as a local package

All required packages are defined in the pyproject.toml file and can be installed via pip as follows:

pip install -e .

Run code

Configuring experiments and running code for Add-Thin is done via Hydra. If you are unfamiliar with how Hydra works, please check out its documentation.
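Hydra composes a nested config from YAML files and lets you override any field with dotted `key=value` flags on the command line. The override mechanism can be sketched with a minimal stdlib-only example (the config keys below are hypothetical, not Add-Thin's actual schema):

```python
def apply_overrides(config, overrides):
    """Merge dotted key=value overrides into a nested config dict."""
    for item in overrides:
        dotted_key, value = item.split("=", 1)
        *parents, leaf = dotted_key.split(".")
        node = config
        for key in parents:
            # Walk (or create) the nested dicts down to the leaf's parent.
            node = node.setdefault(key, {})
        node[leaf] = value
    return config

# Hypothetical base config; Hydra would load this from YAML files.
base = {"data": {"name": "hawkes1"}, "optimizer": {"lr": "1e-3"}}
merged = apply_overrides(base, ["data.name=taxi", "optimizer.lr=5e-4"])
```

Hydra's real implementation does much more (config groups, interpolation, multirun), but command-line flags like `data.name=taxi` map onto nested config fields exactly as sketched here.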

Train model on paper configs

To run Add-Thin with the tuned hyperparameters for different datasets:

./train.py -m --config-name config_name

where config_name should be density_experiments_[1-4] or forecast_experiments_[1-4]. All seeds and datasets are scheduled as a grid search via the multirun flag (-m).
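Hydra's multirun mode expands the swept values into a Cartesian product and launches one job per combination. What that scheduling amounts to can be sketched as follows (the seeds and dataset names here are illustrative, not the exact values in the paper configs):

```python
from itertools import product

# Hypothetical sweep values; the paper configs define their own.
seeds = [0, 1, 2]
datasets = ["hawkes1", "taxi"]

# Multirun launches one job per (seed, dataset) combination.
jobs = [{"seed": s, "data.name": d} for s, d in product(seeds, datasets)]
```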

Train model with custom hyperparameters

To run Add-Thin with your own parameters:

./train.py 

where you are expected to set the parameter values either in the default configs or via command-line flags.

Run hyperparameter sweep for one dataset

To run a hyperparameter sweep over the learning rate and number of mixture components:

./train.py -m --config-name hyperparameter_sweep data.name=data_set

where data_set can be each of the dataset names: hawkes2, reddit_politics_submissions, reddit_askscience_comments, yelp_mississauga, yelp_airport, taxi, nonstationary_renewal, pubg, twitter, stationary_renewal, self_correcting, nonstationary_poisson, hawkes1.
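Since a mistyped dataset name only fails after the sweep has been launched, a quick local check against the list above can save a scheduler round-trip (a convenience sketch, not part of the repository):

```python
# Dataset names accepted by the hyperparameter sweep, from the list above.
DATASETS = {
    "hawkes1", "hawkes2", "self_correcting", "stationary_renewal",
    "nonstationary_renewal", "nonstationary_poisson", "taxi", "twitter",
    "pubg", "yelp_airport", "yelp_mississauga",
    "reddit_politics_submissions", "reddit_askscience_comments",
}

def check_dataset(name):
    """Fail fast with the valid options if the dataset name is unknown."""
    if name not in DATASETS:
        raise ValueError(f"unknown dataset {name!r}; choose one of {sorted(DATASETS)}")
    return name
```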

Evaluate metrics for trained model

A trained model can be evaluated on the test set via either the density notebook or the forecast notebook.
