Support fused_attention_qkv for auto_parallel llama #8432
Merged
Codecov / codecov/project: failed May 16, 2024 in 0s
Project coverage: 55.42% (target 58.00%)
View this Pull Request on Codecov
Codecov Report
Attention: Patch coverage is 0%, with 6 lines in your changes missing coverage. Please review.
Project coverage is 55.42%. Comparing base (53ad2da) to head (5e2edde).
Report is 6 commits behind head on develop.
| Files | Patch % | Lines |
|---|---|---|
| paddlenlp/transformers/llama/modeling_auto.py | 0.00% | 6 Missing |
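The uncovered lines live in the fused QKV path of modeling_auto.py. For orientation, here is a minimal sketch of the general technique the PR title refers to; the class name, layer names, and shapes below are illustrative assumptions, not the PR's actual code:

```python
import paddle
import paddle.nn as nn


class FusedQKVProjection(nn.Layer):
    """Hypothetical sketch: produce Q, K, and V with one fused matmul
    instead of three separate projections, then split the result."""

    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        # A single weight of shape [hidden_size, 3 * hidden_size]
        # replaces separate q_proj / k_proj / v_proj layers.
        self.qkv_proj = nn.Linear(hidden_size, 3 * hidden_size)

    def forward(self, hidden_states):
        # hidden_states: [batch, seq_len, hidden_size]
        mixed = self.qkv_proj(hidden_states)
        # Recover the three tensors from the fused output.
        q, k, v = paddle.split(mixed, num_or_sections=3, axis=-1)
        # Reshape for multi-head attention: [batch, seq, heads, head_dim].
        # (0 in Paddle's reshape keeps the corresponding input dim.)
        q = q.reshape([0, 0, self.num_heads, self.head_dim])
        k = k.reshape([0, 0, self.num_heads, self.head_dim])
        v = v.reshape([0, 0, self.num_heads, self.head_dim])
        return q, k, v
```

Fusing the three projections into one matmul cuts kernel launches and gives auto-parallel a single weight to shard. Since a fused branch like this only runs when the corresponding option is enabled, it is plausible that the default CI configuration never exercises it, which would explain the 0% patch coverage.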
Additional details and impacted files
Coverage Diff

| | develop | #8432 | +/- |
|---|---|---|---|
| Coverage | 55.43% | 55.42% | -0.01% |
| Files | 616 | 617 | +1 |
| Lines | 96243 | 96283 | +40 |
| Hits | 53348 | 53366 | +18 |
| Misses | 42895 | 42917 | +22 |
☔ View full report in Codecov by Sentry.
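For reference, the project-coverage figures above can be reproduced from the hits/lines counts. The two-decimal truncation below is an assumption, but it matches every value in this report:

```python
import math


def codecov_pct(hits: int, lines: int) -> float:
    # Assumption: Codecov truncates (rather than rounds) to two decimals;
    # rounding would give 55.43% for head, not the reported 55.42%.
    return math.floor(100.0 * hits / lines * 100) / 100


base = codecov_pct(53348, 96243)  # 55.43 (base, commit 53ad2da)
head = codecov_pct(53366, 96283)  # 55.42 (head, commit 5e2edde)
print(base, head, round(head - base, 2))  # 55.43 55.42 -0.01
```

The row-level deltas are consistent too: +18 hits and +22 misses account for the +40 new lines.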