[Dy2St] Optimize int(Value) and float(Value) error messages #68328

Merged 3 commits on Sep 25, 2024
55 changes: 45 additions & 10 deletions python/paddle/pir/math_op_patch.py
@@ -477,18 +477,53 @@ def _T_(self):
return _C_ops.transpose(self, perm)

def _int_(self):
raise TypeError(
"int(Value) is not supported in static graph mode. If you are using @to_static, you can try this:\n"
"1. If you want to get the value of Value, you can switch to non-fullgraph mode by setting @to_static(full_graph=True).\n"
"2. If you want to run it in full graph mode, you need use Value.astype(paddle.int32), and do not use int(Value)."
)
error_msg = """\
int(Tensor) is not supported in static graph mode, because its value is not available while the static graph is being built.
It is usually triggered implicitly by logging, for example:
    >>> logging.info(f"The value of x is: {int(x)}")
                                           ^ `x` is a Tensor, so `int(x)` triggers int(Tensor)

There are two common workarounds:
1. If you are logging Tensor values, consider logging only in dynamic graph mode. Modify the following code
    >>> logging.info(f"The value of x is: {int(x)}")
to
    >>> if paddle.in_dynamic_mode():
    ...     logging.info(f"The value of x is: {int(x)}")

2. If you need to convert the Tensor type, modify the following code
    >>> x = int(x)
to
    >>> x = x.astype("int64")
"""

raise TypeError(textwrap.dedent(error_msg))
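The `textwrap.dedent` call exists so the multi-line message can be written indented inside the method body. A minimal stdlib-only sketch of the pattern (the message text below is abbreviated and hypothetical, not Paddle's exact wording):

```python
import textwrap

def make_error_msg():
    # The message is indented to match the surrounding code; textwrap.dedent
    # strips the common leading whitespace before the string is raised.
    # The backslash after the opening quotes suppresses the leading newline.
    msg = """\
    int(Tensor) is not supported in static graph mode.
    Try Tensor.astype("int64") instead."""
    return textwrap.dedent(msg)

print(make_error_msg())
```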

def _float_(self):
raise TypeError(
"float(Value) is not supported in static graph mode. If you are using @to_static, you can try this:\n"
"1. If you want to get the value of Value, you can switch to non-fullgraph mode by setting @to_static(full_graph=True).\n"
"2. If you want to run it in full graph mode, you need use Value directly, and do not use float(Value)."
)
error_msg = """\
float(Tensor) is not supported in static graph mode, because its value is not available while the static graph is being built.
It is usually triggered implicitly by logging, for example:
    >>> logging.info(f"The value of x is: {float(x)}")
                                           ^ `x` is a Tensor, so `float(x)` triggers float(Tensor)

There are two common workarounds:
1. If you are logging Tensor values, consider logging only in dynamic graph mode. Modify the following code
    >>> logging.info(f"The value of x is: {float(x)}")
to
    >>> if paddle.in_dynamic_mode():
    ...     logging.info(f"The value of x is: {float(x)}")

2. If you need to convert the Tensor type, modify the following code
    >>> x = float(x)
to
    >>> x = x.astype("float64")
"""
raise TypeError(textwrap.dedent(error_msg))
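For context, Python dispatches `int(obj)` and `float(obj)` to the type's `__int__`/`__float__` hooks, which is why patching these methods onto the static-graph `Value` type is enough to intercept the calls. An illustrative stand-in class (not Paddle's actual `Value`):

```python
# FakeValue is a hypothetical stand-in: defining __int__/__float__ that raise
# TypeError turns int(obj) and float(obj) into the custom error, which is the
# mechanism the patched methods above rely on.
class FakeValue:
    def __int__(self):
        raise TypeError("int(Tensor) is not supported in static graph mode.")

    def __float__(self):
        raise TypeError("float(Tensor) is not supported in static graph mode.")

try:
    int(FakeValue())
except TypeError as e:
    print(e)  # int(Tensor) is not supported in static graph mode.
```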

def _bool_(self):
error_msg = """\
Expand Down