[FEA] Plugin should throw same arithmetic exceptions as Spark #5196
Our plugin currently throws:

```scala
def divByZeroError(): Nothing = {
  throw new ArithmeticException("divide by zero")
}

def divOverflowError(): Nothing = {
  throw new ArithmeticException("Overflow in integral divide.")
}
```
Yes. As noted in the issue description:
There's already a
Could we use
We already do, see
Could you please capture the exceptions that Spark throws that we want to match, including the stack traces? From reading the code, it looks like the error checking in Spark for the NaN and Inf cases happens inside
It appears that it will throw a regular
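As a hedged sketch of one way to compare the captured traces, a small Python helper could pull the fully qualified exception class off the first line of a Java-style stack trace. The helper name and the sample traces below are illustrative assumptions, not the plugin's test API:

```python
def exception_class(stack_trace: str) -> str:
    """Return the fully qualified exception class from the first line of a
    Java-style stack trace, assuming the usual 'class: message' layout,
    e.g. 'java.lang.ArithmeticException: divide by zero'."""
    first_line = stack_trace.strip().splitlines()[0]
    return first_line.split(":", 1)[0].strip()

# Illustrative, abbreviated traces for the CPU (Spark) and GPU (plugin) runs.
cpu_trace = ("org.apache.spark.SparkArithmeticException: Overflow in integral divide.\n"
             "\tat org.apache.spark.sql.errors.QueryExecutionErrors...")
gpu_trace = ("java.lang.ArithmeticException: Overflow in integral divide.\n"
             "\tat com.nvidia.spark.rapids...")

# The mismatch below is exactly what this issue is about.
assert exception_class(cpu_trace) != exception_class(gpu_trace)
```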
Tested on branch 22.08, using Spark 330. The difference occurs when we test these two cases:
Spark will throw
Stack trace of spark-rapids:
For these two test cases:
Both Spark and the rapids plugin throw
Stack trace of rapids:
Closing this issue as all subtasks have been done. Feel free to reopen it if anyone has other opinions.
Is your feature request related to a problem? Please describe.
This is a follow-on to #5182 where we made some exception checks less specific due to changes in Spark 3.3 causing test failures.
We really should throw the same arithmetic exceptions as Spark and these vary by Spark version.
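Since the expected type differs across Spark versions, one way to encode that in the tests is a small version check. This is a minimal sketch: the helper name is hypothetical, and the 3.3 cutoff follows the description in this issue:

```python
def expected_arithmetic_exception(spark_version: str) -> str:
    """Return the exception class ANSI arithmetic errors are expected to
    raise: per this issue, Spark 3.3+ throws SparkArithmeticException,
    while earlier versions throw plain java.lang.ArithmeticException."""
    major, minor = (int(p) for p in spark_version.split(".")[:2])
    if (major, minor) >= (3, 3):
        return "org.apache.spark.SparkArithmeticException"
    return "java.lang.ArithmeticException"

assert expected_arithmetic_exception("3.2.1") == "java.lang.ArithmeticException"
assert expected_arithmetic_exception("3.3.0") == "org.apache.spark.SparkArithmeticException"
```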
Describe the solution you'd like
Update `arithmetic_ops_test.py` and `logic_test.py` to check for the specific exception: `java.lang.ArithmeticException` vs `org.apache.spark.SparkArithmeticException`.

Describe alternatives you've considered
None
Additional context
This is a checklist of all the tests we need to update.
- `test_mod_pmod_by_zero` (Plugin should throw same arithmetic exceptions as Spark part1 #5354)
- `test_unary_minus_ansi_overflow` (`assertMinValueOverflow` should throw same type of exception as Spark #5783)
- `test_abs_ansi_overflow` (`assertMinValueOverflow` should throw same type of exception as Spark #5783)
- `test_floor_ceil_overflow` (Make the error message of changing decimal type the same as Spark's [databricks] #5915)
- `test_div_by_zero_ansi` (Plugin should throw same arithmetic exceptions as Spark part1 #5354)
- `test_div_overflow_exception_when_ansi` (Plugin should throw same arithmetic exceptions as Spark part1 #5354)
- `test_unary_minus_ansi_overflow_day_time_interval` (`assertMinValueOverflow` should throw same type of exception as Spark #5783)
- `test_abs_ansi_overflow_day_time_interval` (`assertMinValueOverflow` should throw same type of exception as Spark #5783)
- `test_add_overflow_with_ansi_enabled_day_time_interval` (Update `GpuAdd` and `GpuSubtract` to throw same type exception as Spark #5949)
- `test_subtraction_overflow_with_ansi_enabled_day_time_interval` (Update `GpuAdd` and `GpuSubtract` to throw same type exception as Spark #5949)
- `test_day_time_interval_division_overflow` (Update the interval division to throw same type exceptions as Spark #6019)
- `test_logical_with_side_effect` (Update `GpuAdd` and `GpuSubtract` to throw same type exception as Spark #5949)
- `test_add_overflow_with_ansi_enabled` (Update `test_add_overflow_with_ansi_enabled` and `test_subtraction_overflow_with_ansi_enabled` to check the exception type for Integral case. #6071)
- `test_subtraction_overflow_with_ansi_enabled` (Update `test_add_overflow_with_ansi_enabled` and `test_subtraction_overflow_with_ansi_enabled` to check the exception type for Integral case. #6071)
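The common pattern behind this checklist can be sketched as follows. The helper, the dictionary, and the expected strings are illustrative assumptions based on this issue, not the plugin's actual test utilities:

```python
# Map each ANSI failure mode to the exception class expected from
# Spark 3.3+ (True) versus earlier versions (False); values per this issue.
EXPECTED = {
    "div_by_zero": {
        True:  "org.apache.spark.SparkArithmeticException",
        False: "java.lang.ArithmeticException",
    },
}

def check_error(error_text: str, kind: str, is_spark_330_or_later: bool) -> None:
    """Assert that the captured error text names the version-specific
    exception class, instead of a loose substring match on the message."""
    expected = EXPECTED[kind][is_spark_330_or_later]
    assert expected in error_text, f"expected {expected} in: {error_text!r}"

check_error("org.apache.spark.SparkArithmeticException: Division by zero",
            "div_by_zero", True)
check_error("java.lang.ArithmeticException: divide by zero",
            "div_by_zero", False)
```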