[Suggestions] Updates suggested for helper.rescale_adapter_scale
#1989
Conversation
Thanks for working on this PR so quickly. Before going into the details, I wanted to discuss something else that came to my mind, namely the argument name […].
Would something like […]?
I agree using […].
We have this in PEFT: `peft/src/peft/tuners/lora/layer.py`, line 276 (commit b926030).
But as mentioned, I think "scale" is ambiguous, as it could be interpreted as the absolute value. However, I do like […].
Renamed the function and the parameter.
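For context, the renamed helper is a context manager that temporarily rescales adapter weights and restores them on exit. A minimal pure-Python sketch of that pattern (an illustration only, not the actual PEFT implementation — the real helper walks the model's LoRA layers and their `scaling` dicts) might look like:

```python
from contextlib import contextmanager

# Hypothetical stand-in for a model holding LoRA-style scaling factors;
# the real PEFT helper operates on the scaling attributes of LoraLayer modules.
class TinyAdapterModel:
    def __init__(self):
        self.scaling = {"default": 1.0}

@contextmanager
def rescale_adapter_scale(model, multiplier):
    """Temporarily multiply every adapter scale by `multiplier`."""
    original = dict(model.scaling)
    try:
        for name in model.scaling:
            model.scaling[name] *= multiplier
        yield model
    finally:
        # Restore the exact original scales on exit, even if an error occurred.
        model.scaling = original

model = TinyAdapterModel()
with rescale_adapter_scale(model, multiplier=0.5):
    print(model.scaling["default"])  # scaled inside the context
print(model.scaling["default"])      # restored after exiting
```

Note the `finally` block: restoring the saved scales there is what guarantees the model is left unchanged even when the body of the `with` statement raises.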
@BenjaminBossan should we add some warning if […]?
I'd prefer to add a sentence about this in the docstring until we know better what the issue really is, something along the lines of: "Warning: It has been reported that when using Apple's MPS backend for PyTorch, it is necessary to add a short sleep time after exiting the context before the scales are fully restored". At the moment, we don't even know if this replicates on other ARM Macs and across PyTorch versions. If it's an extreme edge case, the warning could do more harm than good.
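The suggested docstring addition could be sketched as follows (the placement and exact wording in the merged PR may differ; the function body is elided here):

```python
def rescale_adapter_scale(model, multiplier):
    """Context manager to temporarily rescale the adapter weights of a model.

    Warning: It has been reported that when using Apple's MPS backend for
    PyTorch, it is necessary to add a short sleep time after exiting the
    context before the scales are fully restored.
    """
    ...  # implementation elided
```

Keeping the caveat in the docstring rather than emitting a runtime warning avoids noise for the majority of users while the scope of the MPS issue is still unclear.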
Ok, once @ariG23498 adds this to the docstring, I have no more comments :)
@ariG23498 Could you please merge with/rebase on the latest […]?
Apologies for the delay! I have rebased just now. |
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update. |
LGTM. Thanks for the updates @ariG23498 and thanks @SimJeg for your valuable feedback.
Linked: #1940
CC: @BenjaminBossan