- OU
- ARIMA (might be AR)
- MLP
- MLP + MCCR Loss
- Bahdanau Attention
- Bahdanau Attention + MCCR
- Train/Valid : loss between `y_hat` and `y` (transformed)
- Test : loss between `y_hat_inv` (inverse transformed) and `y_raw`
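The `+ MCCR` model variants above use the correntropy-induced loss of Feng et al. (2015). A minimal NumPy sketch, where the function name and the bandwidth `sigma` are illustrative choices, not the repo's actual implementation:

```python
import numpy as np

def mccr_loss(y_hat, y, sigma=1.0):
    # MCCR loss (Feng et al., 2015):
    #   sigma^2 * (1 - exp(-(y - y_hat)^2 / sigma^2)), averaged over samples.
    # Near zero residual it behaves like MSE; large residuals saturate
    # toward sigma^2, which makes the loss robust to outliers.
    err2 = (y - y_hat) ** 2
    return float(np.mean(sigma**2 * (1.0 - np.exp(-err2 / sigma**2))))
```

For residuals much smaller than `sigma` this is close to squared error, while a single extreme residual can contribute at most `sigma**2`.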
- XGBoost
- MLP
- MLP + MCCR
- LSTNet (Skip Layer)
- LSTNet (Skip Layer) + MCCR
- Transformer
- Transformer + MCCR
- Train/Valid : loss between `y_hat` and `y` (transformed)
- Test : loss between `y_hat_inv` (inverse transformed) and `y_raw`
- PyTorch (>= 1.8.0)
- Optuna (>= 2.3.0)
- scikit-learn (>= 0.24.0)
- Run in a Singularity (>= 3.6) container generated from `torch-py3.def`
- Run Singularity with the following options
    - bind paths (`/input` for input, `/mnt/data` for output)
    - enable NVIDIA devices with `--nv`
- Containers install dependencies from `requirements.txt`, generated with the following command

```
$ poetry export -f requirements.txt --output requirements.txt --without-hashes
```
- Running models in the container (Singularity) with the following commands

```
TORCH_IMG=my_torch_img.sif
CASES=rnn_mul_lstnet_skip_mccr
CASE_NAME=210818_LSTNet_MCCR

mkdir -p /data/appleparan/"${CASE_NAME}"
singularity exec --nv \
    --bind "${HOME}"/input:/input:ro,/data/"${CASE_NAME}":/mnt/data:rw \
    ${TORCH_IMG} python3 -m mise --dl ${CASES}
```
- LSTNet
    - Lai, Guokun, et al. "Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks." The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. 2018.
- Transformer
    - Zerveas, George, et al. "A Transformer-based Framework for Multivariate Time Series Representation Learning." arXiv preprint arXiv:2010.02803 (2020).
- MCCR (correntropy-based loss)
    - Feng, Yunlong, et al. "Learning with the Maximum Correntropy Criterion Induced Losses for Regression." Journal of Machine Learning Research 16.1 (2015): 993-1034.
- Jongsu Kim and Changhoon Lee. "Deep Particulate Matter Forecasting Model Using Correntropy-Induced Loss." arXiv preprint arXiv:2106.03032 (2021).
    - Accepted in Journal of Mechanical Science and Technology