
Awesome Wireless Traffic Forecasting Library

Python 3.10 | PyTorch 1.12 | License: CC BY-NC-SA

Introduction

  • This is an open-source library for wireless traffic forecasting (WTF-Lib), with a particular focus on deep wireless traffic analysis.
  • The library provides a variety of state-of-the-art short- and long-term wireless traffic forecasting methods.
  • We provide a clean codebase for evaluating advanced deep time series models or developing your own, which serves as a strong benchmark.

Leaderboard for WTF

As of November 2023, the top three models for short- and long-term WTF tasks are:

| Model Ranking | Long-term Forecasting | Short-term Forecasting | Runtime  |
|:-------------:|:---------------------:|:----------------------:|:--------:|
| 🥇 1st        | PSLinear              | PSLinear               | DLinear  |
| 🥈 2nd        | PatchTST              | PatchTST               | PSLinear |
| 🥉 3rd        | GWNet                 | FEDformer              | STID     |

The models compared in this leaderboard are listed below. ☑ means that the code has already been included in this repo.

  • PSLinear - Progressively Supervision based on Label Decomposition: Towards Long-Term Wireless Traffic Forecasting on Large-Scale Graphs. [arXiv 2023] [Code].

  • PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. [ICLR 2023] [Code].

  • DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code].

  • GWNet - Graph WaveNet for Deep Spatial-Temporal Graph Modeling. [arXiv 2019] [Code].

  • STID - Spatial-Temporal Identity: A Simple yet Effective Baseline for Multivariate Time Series Forecasting. [arXiv 2023] [Code].

  • DyDgcrn - Dynamic Graph Convolutional Recurrent Network for Traffic Prediction: Benchmark and Solution. [IEEE TKDE 2021](https://ieeexplore.ieee.org/document/9625773) [Code].

  • Mvstgn - MVSTGN: A Multi-View Spatial-Temporal Graph Network for Cellular Traffic Prediction. [IEEE TMC 2021](https://ieeexplore.ieee.org/document/9625773) [Code].

  • Periodformer - Does Long-Term Series Forecasting Need Complex Attention and Extra Long Inputs? [arXiv 2023](https://arxiv.org/abs/2306.05035) [Code].

  • FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting. [ICML 2022] [Code].

  • Pyraformer - Pyraformer: Low-complexity Pyramidal Attention for Long-range Time Series Modeling and Forecasting. [ICLR 2022] [Code].

  • Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. [NeurIPS 2021] [Code].

  • Informer - Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. [AAAI 2021] [Code].

  • Reformer - Reformer: The Efficient Transformer. [ICLR 2020] [Code].

  • STCNet - Deep Transfer Learning for Intelligent Cellular Traffic Prediction Based on Cross-Domain Big Data. [IEEE JSAC 2019] [Code].

  • Transformer - Attention is All You Need. [NeurIPS 2017] [Code].

Long-term Wireless Traffic Forecasting (LWTF) Benchmark

| Methods | Metric | C2TM 4 | C2TM 5 | C2TM 6 | C2TM 7 | C2TM 8 | Milano-All 24 | Milano-All 36 | Milano-All 48 | Milano-All 72 | CBSD 24 | CBSD 36 | CBSD 48 | CBSD 72 |
|---------|--------|--------|--------|--------|--------|--------|---------------|---------------|---------------|---------------|---------|---------|---------|---------|
| PSLinear-MS | MSE | 9.057 | 9.161 | 9.298 | 9.431 | 9.389 | 0.633 | 0.775 | 0.907 | 1.184 | 1.641 | 1.772 | 1.822 | 1.955 |
| | MAE | 0.171 | 0.174 | 0.171 | 0.317 | 0.169 | 0.249 | 0.269 | 0.285 | 0.301 | 0.625 | 0.644 | 0.646 | 0.660 |
| PSLinear-STL | MSE | 9.084 | 9.183 | 9.328 | 9.447 | 9.400 | 0.708 | 0.870 | 1.062 | 1.366 | 1.644 | 1.759 | 1.821 | 1.965 |
| | MAE | 0.174 | 0.172 | 0.172 | 0.169 | 0.167 | 0.275 | 0.294 | 0.315 | 0.343 | 0.633 | 0.649 | 0.653 | 0.669 |
| PatchTST | MSE | 9.226 | 9.34 | 9.509 | 9.504 | 9.561 | 0.662 | 0.832 | 0.929 | 1.222 | 1.87 | 2.036 | 2.113 | 2.084 |
| | MAE | 0.195 | 0.197 | 0.201 | 0.182 | 0.192 | 0.254 | 0.273 | 0.282 | 0.302 | 0.706 | 0.736 | 0.735 | 0.697 |
| STID | MSE | 9.213 | 9.769 | 9.625 | 9.925 | 9.56 | 0.73 | 0.84 | 0.985 | 1.248 | 2.066 | 2.038 | 1.986 | 2.416 |
| | MAE | 0.375 | 0.339 | 0.353 | 0.509 | 0.387 | 0.367 | 0.382 | 0.377 | 0.404 | 0.744 | 0.738 | 0.706 | 0.797 |
| Periodformer | MSE | 11.584 | 11.663 | 11.822 | 11.883 | 11.818 | 1.341 | 1.457 | 1.583 | 1.892 | 5.139 | 5.491 | 5.335 | 5.515 |
| | MAE | 1.001 | 0.98 | 0.975 | 0.938 | 0.923 | 0.571 | 0.539 | 0.542 | 0.54 | 1.213 | 1.251 | 1.231 | 1.235 |
| DLinear | MSE | 12.702 | 12.58 | 12.99 | 13.457 | 12.945 | 0.898 | 1.031 | 1.106 | 1.338 | 2.414 | 2.576 | 2.584 | 2.7 |
| | MAE | 0.516 | 0.464 | 0.494 | 0.456 | 0.556 | 0.359 | 0.37 | 0.392 | 0.403 | 0.781 | 0.805 | 0.806 | 0.815 |
| FEDformer | MSE | 11.587 | 11.631 | 11.679 | 11.763 | 11.544 | 1.836 | 1.836 | 1.961 | 2.274 | 6.048 | 6.368 | 6.474 | 6.434 |
| | MAE | 0.969 | 0.943 | 0.914 | 0.881 | 0.825 | 0.734 | 0.734 | 0.762 | 0.772 | 1.424 | 1.469 | 1.472 | 1.474 |
| Autoformer | MSE | 11.68 | 11.721 | 11.828 | 11.94 | 11.832 | 1.796 | 2.652 | 2.571 | 2.617 | 6.107 | 6.345 | 6.089 | 6.078 |
| | MAE | 1.03 | 1.01 | 0.994 | 0.981 | 0.951 | 0.733 | 0.999 | 0.955 | 0.898 | 1.453 | 1.472 | 1.432 | 1.412 |
| Informer | MSE | 9.401 | 9.513 | 9.634 | 9.894 | 9.724 | 20.268 | 20.236 | 20.292 | 20.394 | 15.156 | 15.125 | 15.196 | 15.219 |
| | MAE | 0.581 | 0.581 | 0.582 | 0.67 | 0.577 | 2.137 | 2.111 | 2.098 | 2.079 | 2.365 | 2.362 | 2.37 | 2.372 |
| Transformer | MSE | 9.373 | 9.474 | 9.612 | 9.796 | 9.712 | 20.049 | 20.019 | 20.113 | 20.243 | 14.622 | 15.097 | 15.137 | 15.156 |
| | MAE | 0.584 | 0.583 | 0.582 | 0.639 | 0.58 | 1.988 | 1.948 | 1.941 | 1.906 | 2.339 | 2.351 | 2.352 | 2.392 |
| Mvstgn* | MSE | 9.087 | 9.195 | 9.321 | 9.433 | 9.395 | 1.372 | 1.443 | 2.656 | 2.928 | 3.001 | 2.442 | 3.013 | 3.442 |
| | MAE | 0.352 | 0.36 | 0.357 | 0.329 | 0.337 | 0.524 | 0.49 | 0.559 | 0.731 | 0.813 | 0.783 | 0.861 | 0.907 |
| GWNet* | MSE | 9.178 | 9.288 | 9.431 | 9.443 | 9.42 | 1.21 | 1.603 | 2.151 | 3.413 | 1.893 | 2.06 | 2.074 | 2.225 |
| | MAE | 0.61 | 0.617 | 0.624 | 0.409 | 0.412 | 0.48 | 0.526 | 0.564 | 0.573 | 0.714 | 0.76 | 0.739 | 0.754 |
| Dgcrn* | MSE | 9.086 | 9.198 | 9.398 | 9.527 | 9.478 | - | - | - | - | - | - | - | - |
| | MAE | 0.464 | 0.458 | 0.577 | 0.559 | 0.539 | - | - | - | - | - | - | - | - |

Note that the methods marked with * cannot natively handle LWTF tasks; we reimplemented them ourselves to support this setting.
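
For reference, the sketch below shows how MSE and MAE of the kind reported in this table are commonly computed; the array shapes and variable names are illustrative assumptions, not this repo's actual evaluation code.

```python
import numpy as np

def mse(preds: np.ndarray, targets: np.ndarray) -> float:
    # Mean squared error, averaged over windows, horizons, and cells.
    return float(np.mean((preds - targets) ** 2))

def mae(preds: np.ndarray, targets: np.ndarray) -> float:
    # Mean absolute error, averaged over windows, horizons, and cells.
    return float(np.mean(np.abs(preds - targets)))

# Illustrative shapes: (num_windows, horizon, num_cells); horizon=72
# matches the longest Milano-All / CBSD setting above.
rng = np.random.default_rng(0)
preds = rng.random((32, 72, 100))
targets = rng.random((32, 72, 100))
print(f"MSE: {mse(preds, targets):.3f}  MAE: {mae(preds, targets):.3f}")
```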

Dataset

Here is a summary of supported datasets.

  1. C2TM

  2. Milano

Usage

  1. Install Python 3.8. For convenience, execute the following command:

pip install -r requirements.txt

  2. Prepare the data. You can obtain the well-preprocessed datasets from [Google Drive], [Tsinghua Cloud], or [Baidu Drive], then place the downloaded data in the folder ./dataset.

  3. Train and evaluate a model. We provide experiment scripts for all benchmarks under the folder ./scripts/. You can reproduce the experimental results as in the following examples:

# C2TM
bash run_c2tm.sh

# Milano-All
bash run_milano_all.sh

# Milano-SMS
bash run_milano_sms.sh

# Milano-Call
bash run_milano_call.sh

# Milano-Internet
bash run_milano_internet.sh
  4. Develop your own model (a minimal sketch is given after this list).
  • Add the model file to the folder ./models. You can follow the example of ./models/Transformer.py.
  • Include the newly added model in Exp_Basic.model_dict of ./exp/exp_basic.py.
  • Create the corresponding scripts under the folder ./scripts.
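
To make step 4 concrete, here is a minimal sketch of a hypothetical new model file. The Model class layout and the configs.seq_len / configs.pred_len attributes are assumptions modeled on the ./models/Transformer.py convention referenced above, not the repo's exact interface; check that file before copying this pattern.

```python
# ./models/MyLinear.py: a hypothetical example model file.
# The configs attributes (seq_len, pred_len) are assumptions; follow
# ./models/Transformer.py for the interface this repo actually uses.
import torch.nn as nn

class Model(nn.Module):
    """Per-channel linear forecaster: input window -> forecast horizon."""

    def __init__(self, configs):
        super().__init__()
        self.seq_len = configs.seq_len    # length of the input window
        self.pred_len = configs.pred_len  # forecasting horizon
        self.projection = nn.Linear(self.seq_len, self.pred_len)

    def forward(self, x):
        # x: [batch, seq_len, num_cells] -> [batch, pred_len, num_cells]
        return self.projection(x.permute(0, 2, 1)).permute(0, 2, 1)
```

After adding the file, register it with an entry along the lines of 'MyLinear': MyLinear (the imported module) in Exp_Basic.model_dict of ./exp/exp_basic.py, and create a matching script under ./scripts.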

Visualization

Place figures here.

Citation

If you find this repo useful, please cite our paper.

Coming soon!
