
ENH: Warn when from_pretrained misses PEFT keys #2118

Conversation

BenjaminBossan (Member)

After merging #2084, we now clean up the missing_keys when loading a PEFT adapter, removing all but the relevant keys (it is expected that base model keys are missing when loading a PEFT adapter).

Since the presence of missing_keys now genuinely indicates that something may have gone wrong during loading, we can warn the user when they call PeftModel.from_pretrained.

Note that load_adapter still does not warn, since it returns the load_result, which users can already check; from_pretrained does not offer that possibility.
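
To make the difference concrete, here is a minimal sketch (the base model is arbitrary and the adapter paths are placeholders, not part of this PR; the exact warning text may differ):

```python
import warnings

from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# from_pretrained returns only the model, so missing adapter keys
# are now surfaced as a warning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # "path/to/lora-adapter" is a placeholder for a real adapter checkpoint
    model = PeftModel.from_pretrained(base_model, "path/to/lora-adapter")
for w in caught:
    print(w.message)

# load_adapter returns the load_result, which users can inspect
# directly, so no warning is needed there.
load_result = model.load_adapter("path/to/other-adapter", adapter_name="other")
if load_result.missing_keys:
    print("missing adapter keys:", load_result.missing_keys)
```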

@BenjaminBossan (Member, Author)

See #2115 for more context.

Ping @yaswanth19 for awareness.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan BenjaminBossan requested a review from SunMarc October 1, 2024 10:25
@SunMarc (Member) left a comment

Thanks for adding this, @BenjaminBossan! This will be really helpful for users to confirm that everything has been loaded correctly. I left a question.

src/peft/peft_model.py (review thread, resolved)
@BenjaminBossan BenjaminBossan merged commit d9d3059 into huggingface:main Oct 2, 2024
14 checks passed
@BenjaminBossan BenjaminBossan deleted the enh-warn-when-missing-peft-weight branch October 2, 2024 16:52
BenjaminBossan added a commit to BenjaminBossan/transformers that referenced this pull request Oct 10, 2024
When loading a LoRA adapter, so far there was only a warning when there were unexpected keys in the checkpoint. Now there is also a warning when there are missing keys.

This change is consistent with huggingface/peft#2118 in PEFT and the planned PR huggingface/diffusers#9622 in diffusers.

Apart from this change, the error message for unexpected keys was slightly altered for consistency (it should be more readable now). Also, besides adding a test for the missing keys warning, a test for the unexpected keys warning was added as well, since it had been missing so far.
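
As a rough sketch of how a user might observe these warnings through the PEFT integration in transformers (the model ID and adapter path are placeholders, and the warning wording is not quoted from the actual implementation):

```python
import warnings

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # load_adapter comes from the PEFT integration in transformers;
    # "path/to/lora-adapter" is a placeholder for a real LoRA checkpoint.
    model.load_adapter("path/to/lora-adapter")

# After this change, both unexpected *and* missing keys produce warnings.
for w in caught:
    print(w.message)
```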
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Oct 22, 2024
ArthurZucker pushed a commit to huggingface/transformers that referenced this pull request Oct 24, 2024
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
BernardZach pushed a commit to innovationcore/transformers that referenced this pull request Dec 6, 2024