- Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin, "FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting," in Proc. 39th International Conference on Machine Learning (ICML 2022), Baltimore, Maryland, July 17-23, 2022. [paper]
Frequency Enhanced Decomposed Transformer (FEDformer) is more efficient than the standard Transformer, achieving linear complexity in the sequence length. Our empirical studies on six benchmark datasets show that, compared with state-of-the-art methods, FEDformer can reduce prediction error by 14.8% and 22.6% for multivariate and univariate time series, respectively.
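The "decomposed" part of the name refers to a seasonal-trend decomposition applied inside the model. As a rough illustration (not the repository's PyTorch implementation), the trend can be extracted with a moving average and the seasonal part taken as the residual; the function name and kernel size below are illustrative:

```python
import numpy as np

def series_decomp(x, kernel_size=25):
    """Split a 1-D series into seasonal and trend components.

    The trend is a moving average (with edge padding so the output
    keeps the input length); the seasonal part is the residual.
    """
    pad = (kernel_size - 1) // 2
    xp = np.pad(x, (pad, kernel_size - 1 - pad), mode="edge")
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(xp, kernel, mode="valid")  # same length as x
    seasonal = x - trend
    return seasonal, trend

# Example: a sine wave riding on a linear trend.
x = np.sin(np.linspace(0, 8 * np.pi, 200)) + np.linspace(0, 3, 200)
seasonal, trend = series_decomp(x)
```

By construction the two components sum exactly back to the input, which is what lets the model process them separately and recombine them.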
This source code is also simultaneously updated in the Repo.
Figure 1. Overall structure of FEDformer
Figure 2. Frequency Enhanced Block (FEB)
Figure 3. Frequency Enhanced Attention (FEA)
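The Frequency Enhanced Block of Figure 2 operates in the frequency domain: the input is transformed with an FFT, only a small random subset of frequency modes is kept and multiplied by learned complex weights, and the result is transformed back. A minimal NumPy sketch of this idea (not the repository's PyTorch implementation; the random weights here stand in for learned parameters, and the function name is illustrative):

```python
import numpy as np

def frequency_enhanced_block(x, num_modes=8, seed=0):
    """Sketch of a frequency-enhanced block on input x of shape (seq_len, d_model)."""
    rng = np.random.default_rng(seed)
    seq_len, d_model = x.shape
    spectrum = np.fft.rfft(x, axis=0)          # (seq_len // 2 + 1, d_model), complex
    n_freq = spectrum.shape[0]
    k = min(num_modes, n_freq)
    modes = rng.choice(n_freq, size=k, replace=False)  # random subset of modes
    # Learned complex weights are replaced by random placeholders here.
    weights = rng.standard_normal((k, d_model)) + 1j * rng.standard_normal((k, d_model))
    out = np.zeros_like(spectrum)
    out[modes] = spectrum[modes] * weights     # transform only the kept modes
    return np.fft.irfft(out, n=seq_len, axis=0)

y = frequency_enhanced_block(np.random.default_rng(1).standard_normal((96, 16)))
print(y.shape)  # (96, 16)
```

Keeping only O(k) modes instead of the full spectrum is what gives the model its linear, rather than quadratic, cost in the sequence length.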
- Install Python 3.6 and PyTorch 1.9.0.
- Download data. All six benchmark datasets can be obtained from [Autoformer] or [Informer].
- Train the model. We provide experiment scripts for all benchmarks under the `./scripts` folder. You can reproduce the experiment results with:

```bash
bash ./scripts/run_M.sh
bash ./scripts/run_S.sh
```
If you find this repo useful, please cite our paper.
@inproceedings{zhou2022fedformer,
title={{FEDformer}: Frequency enhanced decomposed transformer for long-term series forecasting},
author={Zhou, Tian and Ma, Ziqing and Wen, Qingsong and Wang, Xue and Sun, Liang and Jin, Rong},
booktitle={Proc. 39th International Conference on Machine Learning (ICML 2022)},
location={Baltimore, Maryland},
pages={},
year={2022}
}
- [Survey] Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, and Liang Sun. "Transformers in time series: A survey." arXiv preprint arXiv:2202.07125 (2022). [paper]
- [Tutorial] Qingsong Wen, Linxiao Yang, Tian Zhou, Liang Sun, "Robust Time Series Analysis and Applications: An Industrial Perspective," in the 28th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD 2022), Washington DC, USA, Aug. 14-18, 2022. [Website]
If you have any questions or would like to use the code, please contact tian.zt@alibaba-inc.com or maziqing.mzq@alibaba-inc.com.
We gratefully acknowledge the following GitHub repositories for their valuable code bases and datasets:
https://github.com/thuml/Autoformer
https://github.com/zhouhaoyi/Informer2020
https://github.com/zhouhaoyi/ETDataset
https://github.com/laiguokun/multivariate-time-series-data