update transformers package #471
Conversation
PR Summary
Updates the transformers package minimum version requirement to fix compatibility with the NV-Embed-v2 model's LatentAttentionConfig implementation.
- Changed the transformers version constraint from '>4.34.0,<=5.0' to '>=4.46.2,<=5.0' in /libs/infinity_emb/pyproject.toml to resolve the AttributeError with LatentAttentionConfig (see the sketch below)
- Maintains upper-bound compatibility with transformers 5.0 while ensuring the minimum version supports newer attention implementations
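For illustration, the constraint change described above would look roughly like this in /libs/infinity_emb/pyproject.toml; the surrounding table and key layout are assumed (a Poetry-style dependency section), not copied from the actual diff:

```diff
 [tool.poetry.dependencies]
-transformers = ">4.34.0,<=5.0"
+transformers = ">=4.46.2,<=5.0"
```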
1 file reviewed, no comments (Greptile)
Codecov Report: All modified and coverable lines are covered by tests ✅
@@            Coverage Diff             @@
##             main     #471      +/-   ##
==========================================
- Coverage   79.57%   79.51%   -0.06%
==========================================
  Files          41       41
  Lines        3417     3417
==========================================
- Hits         2719     2717       -2
- Misses        698      700       +2
Not sure if all users are happy with bumping the mandatory version! Maybe just update the .lock file.
Got it. Updated the lock file by running
Reverted
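The exact command used above is not shown in this thread. Purely as an illustration, and assuming the project manages dependencies with Poetry, a lock-only update that leaves the pyproject.toml constraint untouched would look something like:

```shell
# Re-resolve only the transformers entry in poetry.lock, without editing
# the version constraint declared in pyproject.toml (assumes Poetry).
poetry update transformers
```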
Not sure if this makes nv-embed-2 work: the model's config.json (https://huggingface.co/nvidia/NV-Embed-v2/blob/main/config.json) says transformers 4.42.4, and this PR bumped 4.45 -> 4.46
Maybe
Let me try that. However, with version 4.45.2 of the transformers package I got the error described in the issue.
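A minimal sketch of such a check, assuming the loading call from the NV-Embed-v2 model card; the failure mode is the AttributeError around LatentAttentionConfig referenced in the summary and in #470, not reproduced here verbatim:

```python
# Print the installed transformers version, then try to load NV-Embed-v2.
# With transformers below the new lower bound (>=4.46.2), this load is
# reported to fail with an AttributeError related to LatentAttentionConfig.
import transformers
from transformers import AutoModel

print("transformers", transformers.__version__)
model = AutoModel.from_pretrained("nvidia/NV-Embed-v2", trust_remote_code=True)
print("loaded:", type(model).__name__)
```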
Related Issue
#470
Checklist