Model Zoo
=========

Summary: This is a non-exhaustive list of internal and external pretrained models. Internal pretrained models are stored on Zenodo.



Note: RLMs are typically task-agnostic models used at the Preprocessing stage (which is often incorrectly called the "pretraining" phase). They are usually trained with Self-Supervised Learning, but Supervised, (narrow-sense) Unsupervised, and Reinforcement Learning can also be used. The end goal is to extract dense representations / features / embeddings. An RLM is typically trained on an external dataset (i.e. the source dataset) which is larger than the one used for the Downstream Task (i.e. the target dataset); see the sketch after the list below.

- ViT
- BEiT
- DeiT 3
- BERT
- GPT-J (open-source GPT-3 alternative)
- HuBERT
- ViLBERT
- MMFT-BERT
- ViLT
- AV-HuBERT
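
As a concrete illustration of the Preprocessing-stage goal described above, the following is a minimal sketch of extracting a dense embedding from one of the listed RLMs. It assumes the Hugging Face ``transformers`` library and the public ``bert-base-uncased`` checkpoint, neither of which is part of this zoo; the internal Zenodo-hosted checkpoints may be loaded differently.

.. code-block:: python

    # Minimal sketch: extract dense representations with a pretrained RLM.
    # Assumes Hugging Face ``transformers``; the internal Zenodo-hosted
    # checkpoints in this zoo may use a different loading mechanism.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    inputs = tokenizer("A sentence to embed.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Mean-pool the token embeddings into a single dense feature vector.
    embedding = outputs.last_hidden_state.mean(dim=1)  # shape: (1, 768)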

Note: DTMs are typically task-specific models used for the Downstream Task. They are usually trained with Supervised Learning, but Self-Supervised, (narrow-sense) Unsupervised, and Reinforcement Learning can also be used. The end goal depends on the specific task. The DTMs listed below are only those compatible with dense-representation RLMs (e.g. Transformer-based encoders); a sketch of such a head follows.
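
To make "compatible with dense-representation RLMs" concrete, here is a minimal sketch of a task-specific head that consumes the pooled embedding produced by a frozen Transformer encoder. The class name, dimensions, and classification task are hypothetical and not taken from any DTM in this zoo.

.. code-block:: python

    # Minimal sketch: a task-specific DTM head on top of a frozen RLM.
    # ``ClassificationHead``, the 768-d input, and the 10 classes are
    # hypothetical placeholders, not an actual model from this zoo.
    import torch
    import torch.nn as nn

    class ClassificationHead(nn.Module):
        """Linear probe mapping a dense RLM embedding to task logits."""

        def __init__(self, embed_dim: int = 768, num_classes: int = 10):
            super().__init__()
            self.fc = nn.Linear(embed_dim, num_classes)

        def forward(self, embedding: torch.Tensor) -> torch.Tensor:
            return self.fc(embedding)

    head = ClassificationHead()
    logits = head(torch.randn(4, 768))  # a batch of 4 pooled RLM embeddings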
