Describe the bug
Although the build attempts to enforce JDK 8 as its target via `maven.compiler.source`/`maven.compiler.target=1.8`, building with JDK 11 succeeds but the plugin then fails at run time:
java.lang.NoSuchMethodError: java.nio.ByteBuffer.mark()Ljava/nio/ByteBuffer;
  at com.nvidia.spark.rapids.MetaUtils$.buildTableMeta(MetaUtils.scala:112)
  at com.nvidia.spark.rapids.MetaUtils$.buildTableMeta(MetaUtils.scala:61)
  at com.nvidia.spark.rapids.MetaUtils$.buildTableMeta(MetaUtils.scala:51)
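For context, a minimal sketch of how the failure arises (not the plugin's actual code; `ByteBufferRepro` is a hypothetical name). JDK 9 added covariant overrides to `ByteBuffer`, and `source`/`target` alone do not stop a JDK 11 compiler from linking against them:

```scala
import java.nio.ByteBuffer

// Minimal sketch, not MetaUtils itself. Compiled by a JDK 9+ compiler
// with only source/target=1.8, the mark() call below links against the
// covariant override ByteBuffer.mark()Ljava/nio/ByteBuffer; introduced
// in JDK 9, so a JDK 8 JVM throws NoSuchMethodError when it runs this.
object ByteBufferRepro {
  def main(args: Array[String]): Unit = {
    val bb = ByteBuffer.allocate(16)
    bb.mark() // JDK 9+ resolves to ByteBuffer.mark(); JDK 8 only has Buffer.mark()
  }
}
```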
Steps/Code to reproduce bug
Build with JDK 11, then launch Spark with spark-rapids under JDK 8: `JAVA_HOME=/path/to/jdk8 spark-shell ...`
Expected behavior
Either:
- transparently handle this with a special Maven profile and `maven.compiler.release`, failing the build if the user's Maven version does not support it (a sketch follows this list), or
- fail the build if we don't want users to build with later JDKs until that is supported.
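A minimal sketch of the first option, assuming a profile in `pom.xml` that activates on JDK 9+ (the profile id and placement are illustrative):

```xml
<!-- Illustrative sketch: on JDK 9+, compile against the JDK 8 API via
     maven.compiler.release instead of relying on source/target alone. -->
<profile>
  <id>jdk9-plus</id>
  <activation>
    <jdk>[9,)</jdk>
  </activation>
  <properties>
    <maven.compiler.release>8</maven.compiler.release>
  </properties>
</profile>
```

Since `--release` requires a JDK 9+ javac and maven-compiler-plugin 3.6+, the build could additionally use maven-enforcer-plugin's `requireMavenVersion`/`requireJavaVersion` rules to fail fast on unsupported toolchains.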
Environment details (please complete the following information)
local dev
Additional context
originally reported by @NVnavkumar
This PR closes #4649, #3682.
- add documentation for the JDK 9+ build property, with potential cross-compilation to lower Java major versions
- add `allowConventionalDistJar` for conventional jar creation when only one shim is included
- automatically detect a conventional jar and suppress warnings about classloader issues
- remove manual configuration of javac args for scala-maven-plugin. The plugin processes `maven.compiler.*` properties and adds the right set of javac options automatically. The 3.4.x version we use for db doesn't handle `maven.compiler.release`; however, we don't need that property at the moment and it's currently not used.
- disambiguate calls to `buf.position()` that generate errors on JDK 11 (see the sketch after this list)
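For illustration, a sketch of the disambiguation pattern (the helper name is hypothetical, not the exact PR diff): upcasting the receiver to `java.nio.Buffer` pins the call to the signature that exists on JDK 8, regardless of which JDK compiles it.

```scala
import java.nio.{Buffer, ByteBuffer}

// Hypothetical helper demonstrating the pattern: routing position(Int)
// through the Buffer supertype links against
// Buffer.position(I)Ljava/nio/Buffer;, which exists on JDK 8, instead of
// the covariant ByteBuffer override that only exists on JDK 9+.
def seekTo(buf: ByteBuffer, offset: Int): Unit = {
  (buf: Buffer).position(offset)
}
```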
Note: this PR builds only with tests skipped (`-DskipTests`) because we still need to address #3851 (comment).
Signed-off-by: Gera Shegalov <gera@apache.org>