
1.1.1 C-MTEB, PL-MTEB, Multi-GPU

@Muennighoff Muennighoff released this 20 Sep 15:31

Updates

  • 🇨🇳 C-MTEB was released and integrated thanks to @staoxiao. Check out the paper here. Together with C-MTEB, the team also released other great embedding resources, such as new SoTA models on MTEB & C-MTEB called BGE, as well as datasets and source code 🚀
  • 🇵🇱 PL-MTEB & BEIR-PL were released and integrated thanks to @rafalposwiata & @kwojtasi. Check out the new leaderboard tab for PL-MTEB: https://huggingface.co/spaces/mteb/leaderboard. Some BEIR-PL datasets are still missing and will be added soon cc @kwojtasi 😇
  • 💻 Clarifications on multi-GPU: Native multi-GPU support for Retrieval thanks to @NouamaneTazi. We also added a clarification in the README on how any task can be run in a multi-GPU setup without requiring any changes in MTEB. MTEB abstracts away how the embeddings are produced: whether the encode function uses a single GPU or multiple GPUs is entirely up to the user 😊
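To illustrate the last point, here is a minimal sketch of the pattern the README describes: MTEB only calls `model.encode(sentences, **kwargs)`, so any multi-GPU strategy can live entirely inside that method. The `MultiGPUModel` class and `encode_on_device` helper below are hypothetical names for illustration; the stand-in encoder just returns constant arrays so the sketch runs anywhere, and in practice you would replace it with a real model pinned to one device (e.g. a SentenceTransformer on `cuda:{device_id}`).

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor


def encode_on_device(device_id, batch):
    # Stand-in per-device encoder: returns a (len(batch), 4) array filled
    # with the device id. A real implementation would run the model's
    # forward pass on that device instead.
    return np.full((len(batch), 4), float(device_id))


class MultiGPUModel:
    """MTEB only requires an .encode(sentences, **kwargs) method; how the
    embeddings are produced — one GPU or many — is up to the model."""

    def __init__(self, n_devices=2):
        self.n_devices = n_devices

    def encode(self, sentences, **kwargs):
        # Shard the input into one contiguous chunk per device, encode the
        # chunks concurrently, then reassemble in the original order.
        chunks = np.array_split(np.array(sentences, dtype=object), self.n_devices)
        with ThreadPoolExecutor(max_workers=self.n_devices) as ex:
            parts = list(
                ex.map(
                    lambda pair: encode_on_device(pair[0], list(pair[1])),
                    enumerate(chunks),
                )
            )
        return np.concatenate(parts, axis=0)


model = MultiGPUModel(n_devices=2)
embeddings = model.encode(["a", "b", "c", "d", "e"])
print(embeddings.shape)  # (5, 4)
```

An instance like this can then be passed straight to `MTEB(...).run(...)`; because the parallelism is hidden inside `encode`, no MTEB-side changes are needed.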

What's Changed

New Contributors

Full Changelog: 1.1.0...1.1.1