FEDformer (ICML 2022 paper)

  • Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin, "FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting," in Proc. 39th International Conference on Machine Learning (ICML 2022), Baltimore, Maryland, July 17-23, 2022. [paper]

Frequency Enhanced Decomposed Transformer (FEDformer) is more efficient than the standard Transformer, with complexity linear in the sequence length. Our empirical studies on six benchmark datasets show that, compared with state-of-the-art methods, FEDformer reduces prediction error by 14.8% and 22.6% for multivariate and univariate time series, respectively.

This source code is also updated simultaneously in the Repo.

Frequency Enhanced Attention

Figure 1. Overall structure of FEDformer.
Figure 2. Frequency Enhanced Block (FEB).
Figure 3. Frequency Enhanced Attention (FEA).
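
The core operation behind FEB and FEA is to map the input sequence into the frequency domain with an FFT, keep only a small, fixed set of randomly selected Fourier modes, mix them with learnable complex weights, and map back with an inverse FFT. Because the number of kept modes does not grow with the input length, the mixing cost stays linear in the sequence length rather than quadratic as in vanilla self-attention. Below is a minimal single-head sketch of this idea in PyTorch; the class name, tensor shapes, and mode-selection policy are simplifications for illustration, not the repo's exact implementation.

import torch
import torch.nn as nn

class SimpleFourierBlock(nn.Module):
    """Minimal sketch of a frequency enhanced block: mixing on a fixed set of Fourier modes."""
    def __init__(self, d_model, seq_len, modes=16):
        super().__init__()
        n_freq = seq_len // 2 + 1                       # number of rfft frequencies
        modes = min(modes, n_freq)
        # Randomly select which Fourier modes to keep (fixed after initialization).
        self.index = sorted(torch.randperm(n_freq)[:modes].tolist())
        # One learnable complex mixing matrix per kept mode.
        scale = 1.0 / (d_model * d_model)
        self.weights = nn.Parameter(
            scale * torch.rand(modes, d_model, d_model, dtype=torch.cfloat))

    def forward(self, x):
        # x: (batch, seq_len, d_model), real-valued
        B, L, D = x.shape
        x_ft = torch.fft.rfft(x, dim=1)                 # (batch, L//2 + 1, d_model), complex
        out_ft = torch.zeros(B, L // 2 + 1, D, dtype=torch.cfloat, device=x.device)
        for k, i in enumerate(self.index):              # only the kept modes are mixed
            out_ft[:, i, :] = x_ft[:, i, :] @ self.weights[k]
        return torch.fft.irfft(out_ft, n=L, dim=1)      # back to the time domain, (B, L, D)

# Usage: a batch of 4 series, length 96, model dimension 512.
block = SimpleFourierBlock(d_model=512, seq_len=96, modes=16)
out = block(torch.randn(4, 96, 512))
print(out.shape)   # torch.Size([4, 96, 512])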

Main Results

(Figure: summary of the main forecasting results on the six benchmark datasets; see the paper for the full result tables.)

Get Started

  1. Install Python 3.6 and PyTorch 1.9.0.
  2. Download the data. You can obtain all six benchmarks from [Autoformer] or [Informer]; a quick sanity check of the downloaded files is sketched after this list.
  3. Train the model. We provide the experiment scripts for all benchmarks under the folder ./scripts. You can reproduce the experiment results by running:
bash ./scripts/run_M.sh
bash ./scripts/run_S.sh
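
As a quick sanity check that the downloaded data is in place before running the scripts, the snippet below loads one benchmark file with pandas. The ./dataset/ETT-small/ETTh1.csv path follows the Autoformer/Informer data layout and is an assumption here; point it at wherever you actually placed the CSVs.

import pandas as pd

# Hypothetical path following the Autoformer/Informer layout; adjust to your setup.
df = pd.read_csv("./dataset/ETT-small/ETTh1.csv")
print(df.shape)             # (number of timestamps, 1 date column + feature columns)
print(df.columns.tolist())  # first column should be 'date', the rest are series values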

Citation

If you find this repo useful, please cite our paper.

@inproceedings{zhou2022fedformer,
  title={{FEDformer}: Frequency enhanced decomposed transformer for long-term series forecasting},
  author={Zhou, Tian and Ma, Ziqing and Wen, Qingsong and Wang, Xue and Sun, Liang and Jin, Rong},
  booktitle={Proc. 39th International Conference on Machine Learning (ICML 2022)},
  location = {Baltimore, Maryland},
  pages={},
  year={2022}
}

Further Reading

  • [Survey] Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, and Liang Sun. "Transformers in time series: A survey." arXiv preprint arXiv:2202.07125 (2022). [paper]
  • [Tutorial] Qingsong Wen, Linxiao Yang, Tian Zhou, Liang Sun, "Robust Time Series Analysis and Applications: An Industrial Perspective," in the 28th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD 2022), Washington DC, USA, Aug. 14-18, 2022. [Website]

Contact

If you have any questions or want to use the code, please contact tian.zt@alibaba-inc.com or maziqing.mzq@alibaba-inc.com.

Acknowledgement

We appreciate the following GitHub repos for their valuable code bases and datasets:

https://github.com/thuml/Autoformer

https://github.com/zhouhaoyi/Informer2020

https://github.com/zhouhaoyi/ETDataset

https://github.com/laiguokun/multivariate-time-series-data
