An Open-Source Framework for Parameter-Efficient Tuning (Delta Tuning).
Overview • Installation • Basic Usage • Docs • Performance
OpenDelta is a toolkit for parameter-efficient tuning methods (which we dub delta tuning), with which users can flexibly assign (or add) a small number of parameters to update while keeping most parameters frozen. With OpenDelta, users can easily implement prefix tuning, adapters, LoRA, or any other type of delta tuning with their preferred PTMs.
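For instance, attaching LoRA modules to a frozen T5 backbone takes only a few lines. The following is a minimal sketch: the module names passed to `modified_modules` are illustrative and may need to be adapted to your backbone.

```python
from transformers import AutoModelForSeq2SeqLM
from opendelta import LoraModel

# Load a pretrained backbone model.
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Attach LoRA modules to the attention projections
# (the module names below are illustrative; inspect your backbone to pick them).
delta_model = LoraModel(backbone_model=t5, modified_modules=["SelfAttention.q", "SelfAttention.v"])

# Freeze everything except the newly added delta parameters.
delta_model.freeze_module(exclude=["deltas"], set_state_dict=True)

# Inspect the modified structure and which parameters remain trainable.
delta_model.log()
```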
Our repo is tested on Python 3.8 and PyTorch 1.9.0. Lower versions may also be supported.
- 2022.03.24 We noticed several bugs in Soft Prompt Tuning and Prefix Tuning, mainly due to their need to customize attention masks and token_type_ids; we are fixing them! For now, please use the other methods, which are more stable and perform better.
- 2022.03.20 Added a colab example to illustrate efficient training and space-saving multitask serving (a rough sketch of the serving idea follows this list).
- 2022.03.20 A new pip version released.
- 2022.02.16 Added support for regular expressions in name-based addressing.
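A rough sketch of the space-saving multitask-serving idea from the colab example: one frozen backbone is shared across tasks, and task-specific deltas are attached or detached on demand. The `detach()`/`attach()` calls below are assumed method names for this workflow; see the colab example for the exact recipe.

```python
from transformers import AutoModelForSeq2SeqLM
from opendelta import AutoDeltaModel

# One shared, frozen backbone for all tasks.
backbone = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Load a task-specific delta checkpoint and attach it to the shared backbone.
delta_mrpc = AutoDeltaModel.from_finetuned("DeltaHub/lora_t5-base_mrpc", backbone_model=backbone)

# ... serve MRPC requests with `backbone` ...

# Swap deltas while reusing the same backbone weights.
delta_mrpc.detach()  # assumed API: removes the delta modules from the backbone
delta_mrpc.attach()  # assumed API: re-inserts them when the task is needed again
```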
```bash
# create a virtualenv (optional)
conda create -n opendelta_env python=3.8
conda activate opendelta_env
```
Install OpenDelta using pip as follows:

```bash
pip install opendelta
```
To play with the latest features, you can also install OpenDelta from source.

```bash
git clone https://github.com/thunlp/OpenDelta.git
cd OpenDelta
python setup.py install
# or, for an editable install while developing:
python setup.py develop
```
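A quick sanity check after installation; this is a minimal sketch, and the `__version__` attribute is assumed to be present (hence the guarded lookup).

```python
# Verify that OpenDelta and its main entry points are importable.
import opendelta
from opendelta import AutoDeltaModel, LoraModel, AdapterModel

print(getattr(opendelta, "__version__", "unknown"))  # assumes a __version__ attribute
```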
```python
from transformers import AutoModelForSeq2SeqLM
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

from opendelta import AutoDeltaModel
# Load a delta checkpoint from DeltaHub and attach it to the T5 backbone.
delta = AutoDeltaModel.from_finetuned("DeltaHub/lora_t5-base_mrpc", backbone_model=t5)
# Inspect the modified structure and the trainable parameters.
delta.log()
```
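Conversely, after training your own delta you can save only the delta parameters instead of the full backbone. The sketch below assumes OpenDelta's default insertion points for T5 and an illustrative output path.

```python
from transformers import AutoModelForSeq2SeqLM
from opendelta import AdapterModel

t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Attach adapters (default insertion points; pass modified_modules to customize)
# and freeze the backbone.
delta_model = AdapterModel(backbone_model=t5)
delta_model.freeze_module(exclude=["deltas"], set_state_dict=True)

# ... train as usual; only the delta parameters receive gradients ...

# Save only the delta parameters (the path is illustrative).
delta_model.save_finetuned("./my_t5_adapter_mrpc")
```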
- You can try to use OpenDelta on any backbone model based on PyTorch (see the sketch after this list).
- However, there is a small chance that the interface of the backbone model's submodules is not supported. We have therefore verified some commonly used models that OpenDelta is sure to support.
- We will keep testing more emerging models.
- Pull requests are welcome when you successfully apply OpenDelta to your own backbone model.
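Applying OpenDelta to one of the verified backbones only requires pointing the delta model at the right submodules via name-based addressing. This is a sketch under assumptions: the module names and the `exclude` patterns below are illustrative and depend on the backbone's architecture.

```python
from transformers import AutoModelForSequenceClassification
from opendelta import AdapterModel

# Any PyTorch backbone from transformers is used in the same way.
bert = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Name-based addressing: insert adapters into submodules whose names end with "output"
# (illustrative; in BERT this matches both the attention output and the feed-forward output).
delta_model = AdapterModel(backbone_model=bert, modified_modules=["output"])

# Freeze the backbone, keeping the deltas and the classification head trainable.
delta_model.freeze_module(exclude=["deltas", "classifier"], set_state_dict=True)
delta_model.log()
```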
 | Lora | Bias Tuning | Adapter Houlsby | Adapter Pfeiffer | Adapter Drop | Adapter Low-Rank | Compacter | Prefix Tuning | Prompt Tuning |
---|---|---|---|---|---|---|---|---|---|
T5 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
GPT-2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
BART | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
DistilBERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
RoBERTa | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
BERT | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
T5-3b(parallel) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
Deberta-v2 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ||
CTRL | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ||
ViT | ✅ |
The up-to-date list is maintained in a Google sheet here and is subject to change at any moment.