fix codestyle

Codecov / codecov/patch failed Jun 12, 2024 in 1s

0.00% of diff hit (target 80.00%)

Annotations

paddlenlp/experimental/transformers/bloom/modeling.py#L221: added line not covered by tests
paddlenlp/experimental/transformers/bloom/modeling.py#L596: added line not covered by tests
paddlenlp/experimental/transformers/chatglm/modeling.py#L275: added line not covered by tests
paddlenlp/experimental/transformers/chatglm_v2/modeling.py#L204: added line not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L18: added line not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L28: added line not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L33-L34: added lines not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L39: added line not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L1351-L1353: added lines not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L1381-L1382: added lines not covered by tests
paddlenlp/experimental/transformers/fused_transformer_layers.py#L1420: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L200: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L299: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L312: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L345: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L635: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L650: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L677: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L684: added line not covered by tests
paddlenlp/experimental/transformers/generation_utils.py#L697: added line not covered by tests
paddlenlp/experimental/transformers/gpt/modeling.py#L205: added line not covered by tests
paddlenlp/experimental/transformers/llama/modeling.py#L348: added line not covered by tests
paddlenlp/experimental/transformers/llama/modeling.py#L436: added line not covered by tests
paddlenlp/experimental/transformers/llama/modeling.py#L829: added line not covered by tests
paddlenlp/experimental/transformers/opt/modeling.py#L149: added line not covered by tests