This repository contains common functionality for writing ML training loops. The goal is to make training loops short and readable (by moving common tasks to small libraries) without removing the flexibility required for research.
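To make the idea concrete, here is a minimal sketch (plain Python, not the CLU API; the `EveryNSteps` class is hypothetical) of the kind of common task such a library factors out of a training loop — running a periodic action such as logging or checkpointing every N steps:

```python
class EveryNSteps:
    """Calls `fn(step)` once every `n` training steps.

    Illustrative only: CLU provides richer periodic-action helpers,
    but the pattern is the same -- the loop body stays short because
    the "when should this run?" bookkeeping lives in a small library.
    """

    def __init__(self, n, fn):
        self.n = n
        self.fn = fn

    def __call__(self, step):
        if step % self.n == 0:
            self.fn(step)


# Usage: log every 100 steps inside an otherwise plain training loop.
logged = []
log_every_100 = EveryNSteps(100, lambda step: logged.append(step))
for step in range(1, 301):
    # ... forward pass, loss, optimizer update would go here ...
    log_every_100(step)
# logged == [100, 200, 300]
```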
To get started, check out this Colab:
https://colab.research.google.com/github/google/CommonLoopUtils/blob/main/clu_synopsis.ipynb
If you're looking for usage examples, see:
https://github.com/google/flax/tree/main/examples
You can also find answers to common questions about CLU on the Flax GitHub discussions page:
https://github.com/google/flax/discussions
Note: At this point we are not accepting contributions. Please fork the repository if you want to extend the libraries for your use case.