Add DatabricksShimVersion to unshimmed class list [databricks] #4921
Conversation
Signed-off-by: Raza Jafri <rjafri@nvidia.com>
@pxLi PTAL
build
Signed-off-by: Raza Jafri <rjafri@nvidia.com>
build
I'm confused by the description here; I don't think this is the right fix. It appears the issue is with the classloader having 2 versions of the DatabricksShimVersion, so I think it needs to be added to the classes we don't shim. We have com/nvidia/spark/rapids/SparkShimVersion and ShimVersion in there already.
You are right. I wanted to put up this patch to unblock the CI last night. I will put up another PR.
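For reference, the unshimmed class list mentioned above is a plain-text file of class-file patterns in the dist module. A minimal sketch of the change, with the file name and surrounding entries assumed rather than taken from this PR's diff, might look like:

```
# dist/unshimmed-common-from-spark311.txt (file name assumed)
com/nvidia/spark/rapids/ShimVersion*
com/nvidia/spark/rapids/SparkShimVersion*
# entry added by this PR so only a single copy of the class is packaged
com/nvidia/spark/rapids/DatabricksShimVersion*
```

Classes matching these patterns are placed once at the root of the dist jar rather than under a per-shim parallel-world directory, so the classloader only ever sees one copy.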
This reverts commit 7782695. Signed-off-by: Raza Jafri <rjafri@nvidia.com>
Signed-off-by: Raza Jafri <rjafri@nvidia.com>
build
Signed-off-by: Raza Jafri <rjafri@nvidia.com>
build
Reverting changes to the DB SparkShims as part of #4857. That change was originally envisioned to avoid referencing the ServiceProvider, but in the case of DB we are not reusing the SparkShimImpl, so we can reference it directly.

Talked to @tgravescs about this and confirmed that the correct fix is to "unshim" the DatabricksShimVersion, so that the class loader will pick up the correct class.

fixes #4918
Signed-off-by: Raza Jafri <rjafri@nvidia.com>
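For context, the version classes involved look roughly like the following Scala sketch; the field lists and toString formats are assumptions, not copied from the plugin source:

```scala
// Sketch of the shim version hierarchy in com.nvidia.spark.rapids (details assumed).
// These classes are referenced from unshimmed code, so they must resolve to a single
// copy at runtime; keeping them in the unshimmed class list avoids the failure where
// two classloaders each load their own DatabricksShimVersion.
sealed abstract class ShimVersion

case class SparkShimVersion(major: Int, minor: Int, patch: Int) extends ShimVersion {
  override def toString(): String = s"$major.$minor.$patch"
}

case class DatabricksShimVersion(major: Int, minor: Int, patch: Int) extends ShimVersion {
  override def toString(): String = s"$major.$minor.$patch-databricks"
}
```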