LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting


🙋 Please let us know if you find a mistake or have any suggestions!

🌟 If you find this resource helpful, please consider starring this repository and citing our research:

@article{kowsher2024llm,
  title={LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting},
  author={Kowsher, Md and Sobuj, Md Shohanur Islam and Prottasha, Nusrat Jahan and Alanis, E Alejandro and Garibay, Ozlem Ozmen and Yousefi, Niloofar},
  journal={arXiv preprint arXiv:2410.11674},
  year={2024}
}

Introduction

LLM-Mixer is a framework designed to improve forecasting accuracy by integrating multiscale time series decomposition with the power of large language models (LLMs). By capturing both short-term and long-term temporal patterns, LLM-Mixer enhances the model's ability to understand complex trends, making it highly effective for time series forecasting tasks.

Requirements

Use Python 3.11 from Miniconda.

  • torch==2.3.0
  • accelerate==0.33.0
  • einops==0.7.0
  • matplotlib==3.7.0
  • numpy==1.23.5
  • pandas==1.5.3
  • scikit_learn==1.2.2
  • scipy==1.12.0
  • tqdm==4.65.0
  • peft==0.12.0
  • transformers==4.44.0
  • deepspeed==0.15.1
  • sentencepiece==0.2.0
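
A minimal environment setup sketch, assuming Miniconda is installed; the environment name llmmixer is an arbitrary placeholder:

  conda create -n llmmixer python=3.11 -y
  conda activate llmmixer
  pip install -r requirements.txt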

Get Started

  1. Install requirements: pip install -r requirements.txt
  2. Download data. You can download all the datasets from Google Drive, Baidu Drive, or Kaggle Datasets. All the datasets are pre-processed and can be used directly.
  3. Train the model by following the examples in ./scripts; a hypothetical invocation is sketched below.
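
A minimal sketch of launching a training run; the script name below (ETTh1.sh) is a hypothetical placeholder for whatever is actually provided in ./scripts, and the supported arguments are defined by those scripts:

  # run one of the provided training scripts (the name here is illustrative)
  bash ./scripts/ETTh1.sh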

Acknowledgement

We are grateful to the following GitHub repositories for their valuable code and efforts.
