
v2.0

Released by @mwalmsley on 04 Apr 19:26 · e6279c1

What's Changed

  • New pretrained architectures: ConvNeXt, EfficientNetV2, MaxViT, and more, each available in several sizes on HuggingFace.
  • Reworked finetuning procedure. All of these architectures can be finetuned through a single common method (see the sketch after this list).
  • Reworked finetuning options. Batch norm finetuning has been removed and a cosine learning rate schedule option added.
  • Reworked finetuning saving/loading. Encoders are now auto-downloaded from HuggingFace.
  • Now supports regression finetuning (as well as multi-class and binary); see pytorch/examples/finetuning and the regression sketch below.
  • Updated timm to 0.9.10, enabling the latest model architectures. Previously downloaded checkpoints may not load correctly!
  • (internal until published) GZ Evo v2 now includes Cosmic Dawn (HSC H2O), giving a significant performance improvement on HSC finetuning. Also now includes GZ UKIDSS (dragged from our archives).
  • Updated PyTorch to 2.1.0.
  • Added support for WebDataset (recommended only for large-scale distributed training); see the sketch below.
  • Improved per-question logging when training from scratch.
  • Added an option to compile the encoder for maximum speed (recommended for pretraining, not finetuning); see the torch.compile sketch below.
  • Deprecated TensorFlow. The research community has converged on PyTorch and newer frameworks like JAX.
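
All of the new architectures share the same finetuning interface. Below is a minimal classifier sketch in the spirit of the finetuning examples; the checkpoint name `mwalmsley/zoobot-encoder-convnext_nano` and the catalog column names are assumptions here, so check pytorch/examples/finetuning for the exact interface.

```python
import pandas as pd
from galaxy_datasets.pytorch.galaxy_datamodule import GalaxyDataModule
from zoobot.pytorch.training import finetune

# Catalog with a binary label column and an image path column (assumed names).
labelled_df = pd.read_csv('some_labelled_galaxies.csv')  # 'ring' (0/1), 'file_loc'

datamodule = GalaxyDataModule(
    label_cols=['ring'],
    catalog=labelled_df,
    batch_size=32,
)

# Auto-downloads the pretrained encoder from HuggingFace.
model = finetune.FinetuneableZoobotClassifier(
    name='hf_hub:mwalmsley/zoobot-encoder-convnext_nano',  # assumed hub name
    num_classes=2,
)

trainer = finetune.get_trainer(save_dir='finetuned')
trainer.fit(model, datamodule)
```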
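
Regression finetuning follows the same pattern. A minimal sketch, assuming a `FinetuneableZoobotRegressor` class that mirrors the classifier above:

```python
from zoobot.pytorch.training import finetune

# Predicts a single continuous value per galaxy instead of class probabilities.
model = finetune.FinetuneableZoobotRegressor(
    name='hf_hub:mwalmsley/zoobot-encoder-convnext_nano',  # assumed hub name
)

# Then train exactly as with the classifier, with a float label column:
# trainer = finetune.get_trainer(save_dir='finetuned_regression')
# trainer.fit(model, datamodule)
```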
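
Because the encoders are published as timm-compatible checkpoints, they can also be loaded directly through timm and, for pretraining-scale workloads, compiled with PyTorch 2.x. The hub name below is illustrative:

```python
import timm
import torch

# Load a pretrained Zoobot encoder as a plain feature extractor.
encoder = timm.create_model(
    'hf_hub:mwalmsley/zoobot-encoder-convnext_nano',  # assumed hub name
    pretrained=True,
    num_classes=0,  # strip the classification head
)

# Optional: compile for maximum speed (per the note above, best for pretraining).
encoder = torch.compile(encoder)

features = encoder(torch.randn(1, 3, 224, 224))  # pooled feature vector
```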
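
For large-scale distributed training, WebDataset streams galaxies from tar shards rather than loading individual files. A minimal sketch using the webdataset library directly; the shard URL pattern and key names are placeholders:

```python
import webdataset as wds

# Brace-expanded shard pattern (placeholder URL).
urls = 'https://example.com/shards/train-{000000..000009}.tar'

dataset = (
    wds.WebDataset(urls)
    .decode('torchrgb')        # decode image bytes to CHW float tensors
    .to_tuple('jpg', 'json')   # yield (image, metadata) pairs
)

# WebLoader wraps a standard PyTorch DataLoader for sharded streaming.
loader = wds.WebLoader(dataset, batch_size=32, num_workers=4)
```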

Full Changelog: v1.0.5...v2.0