[BUG] ShimLoader.updateSparkClassLoader fails with openjdk Java11 #3851
Comments
I have the same issue. I am running Spark 3.2.0 on OpenJDK 11 and am unable to get NVIDIA Spark RAPIDS working.
@fcomte there are a few workarounds; if going back to JDK 8 is not an option, you can try setting `spark.rapids.force.caller.classloader=false`.
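For reference, a minimal sketch of applying that workaround when building a session (the app name and session setup here are illustrative; the config key is the one discussed in this thread):

```scala
import org.apache.spark.sql.SparkSession

// Workaround sketch: fall back to the original default call path by
// disabling the forced caller classloader.
val spark = SparkSession.builder()
  .appName("rapids-jdk11-workaround") // placeholder name
  .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
  .config("spark.rapids.force.caller.classloader", "false")
  .getOrCreate()
```

The same setting can of course be supplied through `spark-defaults.conf` or a `--conf` flag instead of in code.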
Thanks @gerashegalov
I think we can use this issue to implement a PR that automatically falls back on the original default `spark.rapids.force.caller.classloader=false` call path (#5356 (comment)) if we encounter this classloader failure.
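A hypothetical sketch of that automatic fallback; the object name, method names, and the JDK-version heuristic are assumptions for illustration, not the plugin's actual code:

```scala
// Hypothetical fallback sketch: none of these names come from the plugin.
object ClassLoaderFallback {
  // JDK 9+ reports "9", "11", ...; JDK 8 and earlier report "1.8", "1.7", ...
  private def javaMajorVersion: Int = {
    val spec = System.getProperty("java.specification.version")
    if (spec.startsWith("1.")) spec.stripPrefix("1.").toInt else spec.toInt
  }

  // Honor an explicit user setting, but otherwise fall back to the original
  // spark.rapids.force.caller.classloader=false path on JDK 9+, where
  // updateSparkClassLoader is known to fail with some OpenJDK builds.
  def forceCallerClassLoader(userSetting: Option[Boolean]): Boolean =
    userSetting.getOrElse(javaMajorVersion < 9)
}
```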
This PR closes #4649, #3682.

- Add a doc for the JDK9+ property with potential cross-compilation to lower Java major versions.
- Add `allowConventionalDistJar` for conventional jar creation when only one shim is included.
- Automatically detect a conventional jar and suppress warnings about classloader issues.
- Remove manual configuration of javac args for scala-maven-plugin. The plugin processes the `maven.compiler.*` properties and adds the right set of javac options automatically. The 3.4.x version we use for Databricks doesn't handle `maven.compiler.release`, but we don't need it at the moment and it's currently not used.
- Disambiguate calls to `buf.position()` that generate errors on JDK11.

Note this PR only builds with tests skipped (`-DskipTests`) because we still need to address #3851 (comment).

Signed-off-by: Gera Shegalov <gera@apache.org>
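On the `buf.position()` item: the usual cross-JDK fix (shown here as an illustration, not necessarily the PR's exact change) is to pin the receiver to the base `Buffer` type, since JDK 9+ adds a covariant `ByteBuffer.position(Int): ByteBuffer` override that does not exist on JDK 8:

```scala
import java.nio.{Buffer, ByteBuffer}

// Illustrative fix: ascribing the receiver to Buffer selects the
// JDK 8-compatible Buffer.position(Int): Buffer method descriptor, so the
// same bytecode resolves unambiguously on both JDK 8 and JDK 11.
def seek(buf: ByteBuffer, pos: Int): Unit = {
  (buf: Buffer).position(pos)
}
```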
Fixes #3851

Spark loads external [datasources][1] using `Utils.getContextOrSparkClassLoader`. Trampoline to `Utils.getContextOrSparkClassLoader` to make our current code work with external sources, and to unblock JDK9+.

[1]: https://github.com/apache/spark/blob/b63674ea5f746306a96ab8c39c23a230a6cb9566/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala#L652

Signed-off-by: Gera Shegalov <gera@apache.org>
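The trampoline works because `org.apache.spark.util.Utils` is `private[spark]`: a small forwarder compiled into Spark's package namespace can re-export the accessor to the plugin. A minimal sketch, assuming an illustrative object name and package:

```scala
// Illustrative trampoline: lives under org.apache.spark so it can reach the
// private[spark] Utils object, and re-exports the classloader accessor.
package org.apache.spark.rapids

import org.apache.spark.util.Utils

object TrampolineUtil {
  // Prefers the thread context classloader, falling back to the loader
  // that loaded Spark itself.
  def getContextOrSparkClassLoader: ClassLoader =
    Utils.getContextOrSparkClassLoader
}
```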
Describe the bug
Originally reported by @abellina
Steps/Code to reproduce bug
Set the `JAVA_HOME` environment variable to an OpenJDK Java 11 location and run code with the default `spark.rapids.force.caller.classloader=true`. Note this issue does not reproduce with the Microsoft Build of OpenJDK 11.
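As an illustration of the repro (the app name and JDK path are placeholders; the failure is expected during plugin startup):

```scala
// Repro sketch, assuming JAVA_HOME points at an OpenJDK 11 install, e.g.
//   export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   (placeholder path)
// spark.rapids.force.caller.classloader defaults to true; it is spelled
// out below only to make the failing configuration explicit.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("rapids-jdk11-repro") // placeholder name
  .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
  .config("spark.rapids.force.caller.classloader", "true")
  .getOrCreate() // ShimLoader.updateSparkClassLoader fails here on JDK 11
```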
Expected behavior
The RAPIDS plugin should initialize and run successfully on OpenJDK 11.
Environment details (please complete the following information)
local dev environment
Additional context
N/A