[XPU] llama add xpu support #8282
33.78% of diff hit (target 80.00%)
Annotations
All warnings below were raised by codecov/patch: the listed added lines were not covered by tests.

paddlenlp/transformers/linear_utils.py:
  L30-L31, L33-L37, L42-L47, L49

paddlenlp/transformers/llama/modeling.py:
  L411-L413, L415-L417, L581-L582, L617, L627-L629, L631-L638, L709-L710, L744, L749, L757, L1436-L1439, L1720-L1721, L1725-L1727, L1742