[Keras][Bugfix] Fix a bug with the alpha attribute in LeakyReLU that leads to pass conflicts #14707
Conversation
Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.
Generated by tvm-bot
@yongwww Would you be willing to review this PR? Many thanks!
Please add a regression test for it. |
I think it looks fine. Thanks for fixing this.
Please add a comment explaining why things are gated on the version revision.
LGTM. Thank you for this PR and fixing this issue.
@tvm-bot rerun
The `alpha` attribute in LeakyReLU lacks exception checking. If `alpha=nan`, the Keras frontend converts it to Relay IR successfully, but the optimization stage then triggers an unexpected crash and throws `TVMError: Observed 100 rewrite passes, possible conflicting passes?`.
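For reference, here is a minimal sketch of the kind of validation that prevents this, assuming a hypothetical helper inside the Keras frontend (the helper name and the use of `tvm.error.OpAttributeInvalid` are illustrative assumptions, not the literal patch):

```python
import math

import tvm


def _validate_leaky_relu_alpha(keras_layer):
    # Hypothetical helper: reject non-finite alpha values up front so the
    # frontend fails fast instead of emitting Relay IR that later sends the
    # rewrite passes into a conflict loop.
    alpha = keras_layer.get_config().get("alpha")
    if alpha is None or math.isnan(alpha) or math.isinf(alpha):
        raise tvm.error.OpAttributeInvalid(
            "LeakyReLU expects a finite alpha, but got {}".format(alpha)
        )
    return float(alpha)
```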
The Relay IR is the following:
This patch fixes the bug!
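A regression test along these lines should now fail fast in the frontend instead of crashing the optimizer (a sketch only; the model shape, input naming, and the exact expected error type are assumptions):

```python
import pytest
import tensorflow as tf

import tvm
from tvm import relay


def test_leaky_relu_rejects_nan_alpha():
    # Tiny model whose LeakyReLU carries an invalid alpha=nan.
    model = tf.keras.Sequential(
        [tf.keras.layers.LeakyReLU(alpha=float("nan"), input_shape=(4,))]
    )
    shape_dict = {model.input_names[0]: (1, 4)}
    # Before the fix this conversion "succeeded" and the crash only surfaced
    # later during optimization; with the fix the frontend should reject it.
    with pytest.raises(tvm.error.OpAttributeInvalid):
        relay.frontend.from_keras(model, shape_dict)
```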
The stack trace: