
[GCU] Support llama for GCU #8445

Merged: 1 commit (32d66ef) on May 17, 2024
Codecov / codecov/patch — failed May 16, 2024 in 0s

36.00% of diff hit (target 80.00%)

Annotations

codecov/patch warnings — added lines not covered by tests:

- paddlenlp/generation/utils.py#L1212
- paddlenlp/transformers/llama/fusion_ops.py#L56-L57
- paddlenlp/transformers/llama/fusion_ops.py#L60
- paddlenlp/transformers/llama/fusion_ops.py#L64-L65
- paddlenlp/transformers/llama/fusion_ops.py#L111-L112
- paddlenlp/transformers/llama/fusion_ops.py#L168-L169
- paddlenlp/transformers/llama/modeling.py#L446
- paddlenlp/transformers/llama/modeling.py#L479-L480
- paddlenlp/transformers/llama/modeling.py#L485
- paddlenlp/transformers/llama/modeling.py#L487
- paddlenlp/utils/tools.py#L128