
Transformers 4.0.0 and pytorch-lightning 1.0.0 support

@minimaxir released this 01 Dec 03:30 (commit f7278bf)

A release to fix breaking changes from both packages, along with minor tweaks made in the meantime.

  • Minimum versions are now transformers>=4.0.0, pytorch-lightning>=1.0.8, and torch>=1.6.0, with fixes for the breaking changes in each of those major versions (see the install note after this list).
  • Tweaked generation to better match the canonical implementation in transformers 4 (see the generation sketch after this list).
  • Set the default refresh rate for training to 20 to make pytorch-lightning happy (see the Trainer sketch after this list).
  • Set default learning rate for training to 1e-3 since I forgot why it was 1e-4.
  • Reduced both the default vocab size for tokenizers and the CPU config vocab size from 5000 to 1000 tokens, since this allows much easier/faster training in the demo (see the tokenizer sketch after this list).
  • Confirmed that setting fp16=True for training on supported GPUs now works.
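
To pick up the new minimums, something like pip install "transformers>=4.0.0" "pytorch-lightning>=1.0.8" "torch>=1.6.0" should do it, though exact pins may vary with your environment.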
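
For reference, here is a minimal sketch of the canonical transformers 4 sampling path that generation was aligned with; the model, prompt, and sampling parameters below are placeholders, not this package's defaults:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Small GPT-2 as a stand-in; the release itself is model-agnostic.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode("The release notes say", return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; avoids a warning
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```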
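
The refresh rate and fp16 changes correspond to pytorch-lightning Trainer settings; below is a sketch under the assumption that fp16=True maps to Lightning's native 16-bit precision (which requires torch>=1.6.0, matching the new minimum):

```python
from pytorch_lightning import Trainer

trainer = Trainer(
    gpus=1,                        # fp16 requires a supported CUDA GPU
    precision=16,                  # what fp16=True presumably enables under the hood
    progress_bar_refresh_rate=20,  # the new default refresh rate
)
# trainer.fit(model)  # `model` would be the package's LightningModule (not shown)
```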
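
Similarly, a sketch of tokenizer training with the new 1000-token default, assuming the tokenizer is a huggingface-tokenizers ByteLevelBPETokenizer trained on a hypothetical corpus.txt:

```python
from tokenizers import ByteLevelBPETokenizer

tokenizer = ByteLevelBPETokenizer()
tokenizer.train(files=["corpus.txt"], vocab_size=1000)  # default was previously 5000
tokenizer.save_model(".")  # writes vocab.json and merges.txt
```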

Future releases will add more explicit features. There may be extra console output in the meantime; I'll see what I can do to remove it.