support Cambricon MLU devices #1687
Conversation
Hi @huismiling
Thanks so much for adding this! I left one comment with respect to backward compatibility, what do you think?
src/peft/utils/other.py (outdated):

```python
if version.parse(accelerate.__version__) >= version.parse("0.29.0"):
    from accelerate.utils import is_mlu_available
else:
    is_mlu_available = None
```
@BenjaminBossan @younesbelkada Thanks for your work. Will this be OK?
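For context, here is a minimal, self-contained sketch of the version-gated pattern discussed above. The `infer_device()` helper is hypothetical and only illustrates how such a guarded import could be consumed; it is not code from this PR.

```python
# Sketch only: older accelerate releases (before 0.29.0) do not ship
# is_mlu_available, so the import is guarded and the MLU branch is
# skipped when the helper is unavailable.
import accelerate
import torch
from packaging import version

if version.parse(accelerate.__version__) >= version.parse("0.29.0"):
    from accelerate.utils import is_mlu_available
else:
    is_mlu_available = None  # treat MLUs as unavailable on old accelerate


def infer_device() -> str:
    # Hypothetical helper: pick the best available device.
    if is_mlu_available is not None and is_mlu_available():
        return "mlu"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"
```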
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@huismiling Could you please run
@BenjaminBossan
LGTM, thanks. @younesbelkada feel free to merge if you also approve.
Out of curiosity @huismiling, did you manage to run PEFT successfully on Cambricon MLUs?
Thanks for adding the support for Cambricon MLU devices in PEFT!
@BenjaminBossan Currently, I work for Cambricon. Please feel free to @ me for MLU issues.
So you could run the test suite successfully on MLUs?
Yeah, I ran it. There are some assertion errors; most of the errors are like the following. This difference should be acceptable.
Okay, so it's mostly about tolerances. If you manage to find more acceptable tolerance values, feel free to open a PR. Ideally, we would have all tests passing, so that we can more easily detect regressions.
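As an illustrative sketch only (the helper and the tolerance values below are placeholders, not actual PEFT test code), device-dependent tolerances could be expressed like this:

```python
# Sketch: loosen numerical tolerances on MLUs, where results can differ
# slightly from the CUDA/CPU reference. The values are placeholders.
import torch


def assert_outputs_close(actual, expected, device_type: str):
    # Hypothetical tolerance table keyed by device type.
    rtol, atol = (1e-3, 1e-5) if device_type == "mlu" else (1e-5, 1e-8)
    torch.testing.assert_close(actual, expected, rtol=rtol, atol=atol)
```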
Currently, Transformers and Accelerate support Cambricon MLUs (huggingface/transformers#29627, huggingface/accelerate#2552).
This PR enables users to leverage Cambricon MLUs for training and inference of PEFT models.
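As a usage sketch (assuming the Cambricon torch_mlu extension is installed and registers the "mlu" device type in PyTorch; the model name and LoRA hyperparameters below are placeholders, not recommendations from this PR):

```python
# Sketch: LoRA fine-tuning of a causal LM on a Cambricon MLU.
# Assumes torch_mlu is installed so that the "mlu" device is available.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # placeholder model
vocab_size = base.config.vocab_size
model = get_peft_model(base, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))
model.to("mlu")  # move base weights and LoRA adapters to the MLU

# One dummy training step: inputs on the MLU, standard forward/backward.
input_ids = torch.randint(0, vocab_size, (1, 16), device="mlu")
outputs = model(input_ids=input_ids, labels=input_ids)
outputs.loss.backward()
```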