If a given node has a parameter value specified as an int (e.g. `'param': 123`) instead of a float (e.g. `'param': 123.`), it is passed into the kernel as an integer type. However, if the CUDA kernel declares the corresponding argument as a floating point type (e.g. `double`), the integer bytes are reinterpreted as a float and the parameter effectively arrives in the kernel with value 0.
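A minimal sketch of the effect (not from the issue; the `read_param` kernel is hypothetical): a parameter buffer built from a Python int gets an integer dtype, and when the kernel reads those bytes as a `double` the small-integer bit pattern decodes to a denormal value that is essentially 0.

```python
import numpy as np
import pycuda.autoinit
import pycuda.gpuarray as gpuarray
from pycuda.compiler import SourceModule

mod = SourceModule("""
__global__ void read_param(double *param, double *out)
{
    out[0] = param[0];   // kernel assumes the buffer holds doubles
}
""")
read_param = mod.get_function("read_param")

out = gpuarray.zeros(1, dtype=np.float64)

# 'param': 123. -> float64 buffer, bytes line up with the kernel's double*.
good = gpuarray.to_gpu(np.array([123.], dtype=np.float64))
read_param(good, out, block=(1, 1, 1), grid=(1, 1))
print(out.get()[0])   # 123.0

# 'param': 123 -> int64 buffer; the same 8 bytes reinterpreted as a double
# give ~6e-322, i.e. effectively 0, which matches the behavior described above.
bad = gpuarray.to_gpu(np.array([123], dtype=np.int64))
read_param(bad, out, block=(1, 1, 1), grid=(1, 1))
print(out.get()[0])   # ~0.0
```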
In my opinion, `NDComponent.__init__` should enforce a uniform dtype on the values of `params_dict`.
Alternatively, if there is a way to check the expected dtypes from the CUDA kernel in the `pre_run` stage, the values in `params_dict` could be compared against them; on a mismatch, a warning would be issued and the values converted to the dtype required by the kernel. A sketch of the first idea follows below.
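A minimal sketch of the suggested dtype enforcement, assuming a hypothetical `coerce_params` helper (the `NDComponent`/`params_dict` names come from the issue; the helper itself is illustrative only and not part of the existing API):

```python
import warnings
import numpy as np

def coerce_params(params_dict, dtype=np.float64):
    """Hypothetical helper: cast every numeric parameter value to one dtype."""
    coerced = {}
    for name, value in params_dict.items():
        arr = np.asarray(value)
        if np.issubdtype(arr.dtype, np.number) and arr.dtype != dtype:
            # Warn and convert so the kernel never receives an integer buffer
            # where it expects a floating point one.
            warnings.warn("parameter %r has dtype %s; converting to %s"
                          % (name, arr.dtype, np.dtype(dtype)))
            arr = arr.astype(dtype)
        coerced[name] = arr
    return coerced

# e.g. {'param': 123} -> {'param': array(123.)} with dtype float64
print(coerce_params({'param': 123}))
```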