Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
Clean baseline implementation of PPO using an episodic TransformerXL memory
Challenging Memory-based Deep Reinforcement Learning Agents
[Unofficial] PyTorch implementation of "Conformer: Convolution-augmented Transformer for Speech Recognition" (INTERSPEECH 2020)
Symbolic music generation taking inspiration from NLP and human composition process
YAI 11 x @POZAlabs : Improving & Evaluating Music Generation with ComMU
A lightweight PyTorch implementation of the Transformer-XL architecture proposed by Dai et al. (2019)
Simple from-scratch implementations of transformer-based models that match the state of the art.
Scripts for training a language model and for generating nonsense text (translated from Finnish)
[ACL‘20] Highway Transformer: A Gated Transformer.
Implementation of Transformer-XL in Tensorflow 2.0.
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge
End-to-end ASR/LM implementation with PyTorch
A Julia-based implementation of XLNet: A Generalized Autoregressive Pretraining for Language Understanding. < Flux | JuliaText >
Google Colab (Jupyter) notebooks for creating and training music AI models and for generating music with Transformer architectures (Google XLNet/Transformer-XL)
XLNet for generating language.
Code Base for Transformer-XL on Finnish Language
Music and text generation with Transformer-XL.
MahlerNet by Elias Lousseief: a fully working Transformer-XL music AI implementation
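Many of the repositories above build on Transformer-XL's core idea of segment-level recurrence: hidden states from the previous segment are cached and reused as extra attention context when processing the current segment, extending the effective context beyond a single segment. A minimal NumPy sketch of that caching idea, assuming single-head attention and illustrative weight names (not taken from any listed repository):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend_with_memory(segment, memory, w_q, w_k, w_v):
    """Single-head attention where keys/values span [memory; segment].

    Queries come only from the current segment; keys and values also
    cover the cached states of the previous segment, which is the
    segment-level recurrence trick of Transformer-XL.
    """
    context = segment if memory is None else np.concatenate([memory, segment], axis=0)
    q = segment @ w_q                 # queries: current segment only
    k = context @ w_k                 # keys/values: memory + current segment
    v = context @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
d = 8
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))

memory = None
outputs = []
# Process a 12-token sequence as three 4-token segments.
for segment in np.split(rng.normal(size=(12, d)), 3):
    outputs.append(attend_with_memory(segment, memory, w_q, w_k, w_v))
    memory = segment              # cache this segment for the next one
```

In the real architecture the cached states are detached from the gradient graph and combined with relative positional encodings; this sketch only shows how the memory widens the attention context from segment to segment.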