
[BUG] ShimLoader.updateSparkClassLoader fails with openjdk Java11 #3851

Closed
gerashegalov opened this issue Oct 18, 2021 · 4 comments · Fixed by #5646
Labels: bug (Something isn't working)

gerashegalov commented Oct 18, 2021

Describe the bug
Originally reported by @abellina

21/10/18 23:18:41 INFO ShimLoader: Updating spark classloader jdk.internal.loader.ClassLoaders$AppClassLoader@5ffd2b27 with the URLs: jar:file:/home/gshegalov/gits/NVIDIA/spark-rapids/dist/target/rapids-4-spark_2.12-21.12.0-SNAPSHOT.jar!/spark3xx-common/, jar:file:/home/gshegalov/gits/NVIDIA/spark-rapids/dist/target/rapids-4-spark_2.12-21.12.0-SNAPSHOT.jar!/spark320/
21/10/18 23:18:41 ERROR CoarseGrainedExecutorBackend: Executor self-exiting due to : Unable to create executor due to No such method: addURL() on object: jdk.internal.loader.ClassLoaders$AppClassLoader
java.lang.NoSuchMethodException: No such method: addURL() on object: jdk.internal.loader.ClassLoaders$AppClassLoader
        at org.apache.commons.lang3.reflect.MethodUtils.invokeMethod(MethodUtils.java:231)
        at org.apache.commons.lang3.reflect.MethodUtils.invokeMethod(MethodUtils.java:184)
        at com.nvidia.spark.rapids.ShimLoader$.$anonfun$updateSparkClassLoader$3(ShimLoader.scala:216)
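The likely root cause can be demonstrated with a small standalone check (an illustration, not code from spark-rapids): on JDK 8 the system classloader is a `URLClassLoader` with a protected `addURL(URL)` method, while on JDK 9+ it is `jdk.internal.loader.ClassLoaders$AppClassLoader`, which does not extend `URLClassLoader`, so a reflective lookup of `addURL()` finds nothing.

```java
import java.net.URLClassLoader;

public class AppClassLoaderCheck {
    public static void main(String[] args) {
        ClassLoader system = ClassLoader.getSystemClassLoader();
        // On JDK 8 this prints sun.misc.Launcher$AppClassLoader (a URLClassLoader);
        // on JDK 9+ it prints jdk.internal.loader.ClassLoaders$AppClassLoader,
        // which is NOT a URLClassLoader, so there is no addURL() to invoke.
        System.out.println(system.getClass().getName());
        System.out.println("is URLClassLoader: " + (system instanceof URLClassLoader));
    }
}
```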

Steps/Code to reproduce bug

Set the JAVA_HOME environment variable to an OpenJDK Java 11 installation and run code with the default spark.rapids.force.caller.classloader=true.

Note: this issue does not reproduce with the Microsoft Build of OpenJDK 11.

Expected behavior
It should work.

Environment details (please complete the following information)
local dev environment

Additional context
N/A

@gerashegalov gerashegalov added bug Something isn't working ? - Needs Triage Need team to review and classify and removed ? - Needs Triage Need team to review and classify labels Oct 18, 2021

fcomte commented Dec 1, 2021

I have the same issue. I am running Spark 3.2.0 on OpenJDK 11 and am unable to get NVIDIA spark-rapids working.

gerashegalov (Collaborator, Author) commented:

@fcomte there are a few workarounds. If going back to JDK 8 is not an option, you can try --conf spark.rapids.force.caller.classloader=false, which uses a different mechanism that works well in many situations.
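For reference, the workaround as a spark-submit invocation might look like the following; the jar path and application name are placeholders, not taken from this thread (config fragment, not runnable as-is):

```shell
# Hypothetical invocation; jar path and app are placeholders.
spark-submit \
  --conf spark.rapids.force.caller.classloader=false \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --jars rapids-4-spark_2.12-21.12.0-SNAPSHOT.jar \
  my_app.py
```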


fcomte commented Dec 13, 2021

Thanks @gerashegalov, --conf spark.rapids.force.caller.classloader=false works.


gerashegalov commented May 13, 2022

I think we can use this issue to implement a PR that automatically falls back to the original default spark.rapids.force.caller.classloader=false call path #5356 (comment) if we encounter java.lang.NoSuchMethodException on newer JDKs.

gerashegalov added a commit that referenced this issue May 20, 2022
This PR closes #4649, #3682.

- add a doc for JDK9+ property with potential cross-compilation to lower Java major versions
- `allowConventionalDistJar` for conventional jar creation when only one shim is included
- automatically detect conventional jar and suppress warnings about classloader issues 
- remove manual configuration of javac args for scala-maven-plugin. This plugin processes `maven.compiler.*` properties and adds the right set of javac opts automatically. The 3.4.x version we use for db doesn't handle `maven.compiler.release`; however, we don't need it at the moment and it's currently not used.
- disambiguate calls to buf.position() generating errors on JDK11

Note: this PR builds only with tests skipped (`-DskipTests`) because we still need to address #3851 (comment)
  
Signed-off-by: Gera Shegalov <gera@apache.org>
@gerashegalov gerashegalov self-assigned this Aug 16, 2022
@gerashegalov gerashegalov added this to the Aug 8 - Aug 19 milestone Aug 16, 2022
gerashegalov added a commit that referenced this issue Aug 16, 2022
Fixes #3851

Spark loads external [datasources][1] using `Utils.getContextOrSparkClassLoader`

Trampoline to `Utils.getContextOrSparkClassLoader`  to make our current code work with external sources, and to unblock JDK9+ 

[1]:  https://github.com/apache/spark/blob/b63674ea5f746306a96ab8c39c23a230a6cb9566/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala#L652 

Signed-off-by: Gera Shegalov <gera@apache.org>
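A minimal sketch of the fallback logic the fix relies on (an approximation of Spark's `Utils.getContextOrSparkClassLoader`, not the actual spark-rapids code): prefer the current thread's context classloader, and fall back to the loader that loaded the calling class.

```java
public class LoaderUtil {
    // Approximation of Spark's Utils.getContextOrSparkClassLoader:
    // use the thread context classloader if one is set, otherwise
    // fall back to the classloader that loaded this class.
    public static ClassLoader getContextOrDefault() {
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        return ctx != null ? ctx : LoaderUtil.class.getClassLoader();
    }

    public static void main(String[] args) {
        System.out.println(getContextOrDefault());
    }
}
```

Resolving the classloader this way avoids any reflective `addURL()` call on the system classloader, which is what breaks on JDK 9+.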