
[Bug fix] prevent squashing pair u8 dequantize -> s8 quantize #39346

Merged: 13 commits into PaddlePaddle:develop on Feb 14, 2022

Conversation

sfraczek
Contributor

@sfraczek sfraczek commented Feb 3, 2022

PR types

Bug fixes

PR changes

Others

Describe

  • Added code that prevents squashing a dequantize->quantize pair when the dequantize input is uint8 and the quantize output is int8.
  • Added a fix for the post-training quantization check for unsigned output; FC and Conv2d use different attribute names for the fused activation.

@paddle-bot-old

paddle-bot-old bot commented Feb 3, 2022

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@sfraczek sfraczek requested review from wozna and lidanqing-intel and removed request for wozna February 3, 2022 14:09
@sfraczek sfraczek requested a review from wozna February 3, 2022 18:01
@lidanqing-intel lidanqing-intel changed the title prevent squashing pair u8 dequantize -> s8 quantize [Bug fix] prevent squashing pair u8 dequantize -> s8 quantize Feb 7, 2022
@sfraczek
Contributor Author

sfraczek commented Feb 7, 2022

Please do not merge yet; I want to add unit tests. There seems to be a perf regression.

if (dequant_in->inputs[0]->IsOp()) {
  auto prev_op = dequant_in->inputs[0]->Op();
  std::string act_name;
  std::cout << "prev_op->Type(): " << prev_op->Type() << std::endl;

Please don't use std::cout.

@sfraczek
Contributor Author

I was comparing against the wrong baseline. There shouldn't be a problem with this PR.

Contributor

@baoachun baoachun left a comment


LGTM

Contributor

@wozna wozna left a comment


Great job, LGTM

@jczaja jczaja self-requested a review February 14, 2022 09:18
Contributor

@jczaja jczaja left a comment


LGTM

@jczaja jczaja merged commit 66b5348 into PaddlePaddle:develop Feb 14, 2022
@baoachun
Contributor

Hi @sfraczek, QA found that this PR caused the test_quant2_int8_lstm_mkldnn test to fail. Could you please take a look?
https://xly.bce.baidu.com/paddlepaddle/paddle/newipipe/detail/4797233/job/11924197

@sfraczek
Contributor Author

Thank you very much for pointing this out; I will investigate this test. I am already looking into what this PR caused, and I have made one follow-up fix in #39593 for an error that was not caught by our unit tests.

@sfraczek
Contributor Author

I have reverted the change that caused this PTQ LSTM problem in PR #39593.
