
[INFER] llama&qwen2 A8W8 support skip_scale #8987

Closed · wants to merge 17 commits

fix confict (8da2246)
Codecov / codecov/patch failed Sep 12, 2024 in 0s

0.00% of diff hit (target 80.00%)
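For context, the codecov/patch check measures patch coverage: the fraction of lines added by the diff that the test suite executes. A minimal sketch of that arithmetic in Python, assuming the 62 lines annotated below make up the full set of tracked added lines (an assumption; the check output does not state the denominator):

    # Patch coverage = covered added lines / total added lines.
    # Assumption: the 62 annotated lines below are all of the tracked
    # added lines, and none of them were executed by tests.
    covered_added_lines = 0
    total_added_lines = 62
    patch_coverage = 100.0 * covered_added_lines / total_added_lines
    print(f"{patch_coverage:.2f}% of diff hit (target 80.00%)")
    # -> 0.00% of diff hit (target 80.00%); this is below the 80%
    #    threshold, so the check is reported as failed.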

Annotations

codecov/patch warnings (added lines not covered by tests):

paddlenlp/experimental/transformers/bloom/modeling.py: line 296
paddlenlp/experimental/transformers/chatglm/modeling.py: line 380
paddlenlp/experimental/transformers/chatglm_v2/modeling.py: line 293
paddlenlp/experimental/transformers/fused_transformer_layers.py: lines 20, 405-406, 408-409, 635-640, 642-647, 649, 655, 662, 664-665, 673-674, 681, 687-688, 695, 703, 705-706, 708-709, 711-712, 714-717, 719-724, 777-778, 1802-1807, 1809-1814