
Throw SparkArrayIndexOutOfBoundsException for Spark 3.3.0+ #4464

Merged (6 commits) on Jan 7, 2022

Conversation

@razajafri (Collaborator) commented Jan 5, 2022

Throw SparkArrayIndexOutOfBoundsException when elementAt() or getArrayItem() tries to access an out-of-bounds index.

Fixes #4039

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
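
For context, a minimal sketch of the shim pattern this kind of change typically uses, assuming one trait implementation per Spark version range; every name below (RaiseArrayIndexError, arrayIndexError) and the exact SparkArrayIndexOutOfBoundsException constructor are illustrative assumptions, not the PR's actual code:

```scala
// Illustrative sketch of a version-gated shim; all names are assumptions.
// One implementation is compiled per Spark version range, and the plugin
// loads the one matching the Spark version it is running against.

trait RaiseArrayIndexError {
  def arrayIndexError(index: Int, numElements: Int): Throwable
}

// Shim built against Spark 3.3.0+: throw the typed Spark exception so the
// GPU plugin fails the same way the CPU would. The constructor arguments
// here are an assumption about Spark 3.3.0's API.
class RaiseArrayIndexErrorSpark330 extends RaiseArrayIndexError {
  override def arrayIndexError(index: Int, numElements: Int): Throwable =
    new org.apache.spark.SparkArrayIndexOutOfBoundsException(
      errorClass = "INVALID_ARRAY_INDEX",
      messageParameters = Array(index.toString, numElements.toString))
}

// Shim for Spark versions before 3.3.0: keep the plain JVM exception.
class RaiseArrayIndexErrorPre330 extends RaiseArrayIndexError {
  override def arrayIndexError(index: Int, numElements: Int): Throwable =
    new ArrayIndexOutOfBoundsException(
      s"Invalid index: $index, numElements: $numElements")
}
```

Keeping the version dispatch behind one trait lets call sites such as GpuGetArrayItem raise the right exception without any version checks of their own.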
@razajafri self-assigned this on Jan 5, 2022
@razajafri (Collaborator, Author) commented:

build

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@razajafri (Collaborator, Author) commented:

build

@sameerz added the "bug (Something isn't working)" label on Jan 6, 2022
@sameerz added this to the "Dec 13 - Jan 7" milestone on Jan 6, 2022
…d class directly

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@razajafri (Collaborator, Author) commented:

build

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@razajafri (Collaborator, Author) commented:

build

@jlowe (Member) left a comment:

GpuElementAt should also use this new exception shim, as Spark is using the same query exception method there.
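
To illustrate the requested change, a call-site sketch building on the shim trait sketched above (all names assumed, not the actual diff):

```scala
// Illustrative only: GpuElementAt's out-of-bounds check routed through the
// shim instead of constructing ArrayIndexOutOfBoundsException directly.
// element_at uses 1-based indexing, so |ordinal| > numElements is out of
// bounds; on Spark 3.3.0+ the shim surfaces the typed Spark exception.
def checkElementAtBounds(
    ordinal: Int,
    numElements: Int,
    shim: RaiseArrayIndexError): Unit = {
  if (math.abs(ordinal) > numElements) {
    throw shim.arrayIndexError(ordinal, numElements)
  }
}
```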

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@razajafri (Collaborator, Author) commented:

> GpuElementAt should also use this new exception shim, as Spark is using the same query exception method there.

Sorry, I should've caught that. I have added tests for element_at and now throw the new exception.
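
As a rough idea of the behavior such a test pins down, a self-contained sketch (not the PR's actual test code, which lives in the plugin's test suites):

```scala
// Sketch: element_at past the end of an array should raise
// SparkArrayIndexOutOfBoundsException on Spark 3.3.0+ when ANSI mode is on.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, element_at}

object ElementAtOobSketch extends App {
  val spark = SparkSession.builder()
    .appName("element_at-oob-sketch")
    .master("local[1]")
    // out-of-bounds access only throws when ANSI mode is enabled
    .config("spark.sql.ansi.enabled", "true")
    .getOrCreate()
  import spark.implicits._

  val df = Seq(Seq(1, 2, 3)).toDF("arr")

  try {
    df.select(element_at(col("arr"), 100)).collect()
  } catch {
    case e: Throwable =>
      // The typed exception may be wrapped, so walk the cause chain and
      // print each exception class name.
      val chain = Iterator.iterate(e)(_.getCause)
        .takeWhile(_ != null)
        .map(_.getClass.getSimpleName)
      println(chain.mkString(" caused by "))
  } finally {
    spark.stop()
  }
}
```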

@razajafri (Collaborator, Author) commented:

build

@jlowe (Member) left a comment:

Just some copyright issues, but otherwise I think this is good.

@@ -23,6 +23,7 @@ import ai.rapids.cudf.{ColumnView, GroupByAggregation, GroupByOptions, Scalar}
import com.nvidia.spark.rapids.{GpuBinaryExpression, GpuColumnVector, GpuComplexTypeMergingExpression, GpuLiteral, GpuMapUtils, GpuScalar, GpuUnaryExpression}
@jlowe (Member) commented:

2022 copyrights

@@ -19,7 +19,7 @@ package org.apache.spark.sql.rapids
import ai.rapids.cudf.ColumnVector
@jlowe (Member) commented:

2022 copyrights

Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@razajafri (Collaborator, Author) commented:

build

Labels: bug (Something isn't working)
Projects: None yet
Development: merging this pull request may close issue "[BUG] Spark 3.3.0 IT Array test failures" (#4039)
4 participants