[FEA] jdk17 support #4103
Comments
After recent improvements like #5646 this may be largely working; it just needs documentation updates and continuous testing. Verified with:

export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64
export SPARK_HOME=~/dist/spark-3.3.1-bin-hadoop3
export TEST_PARALLEL=0
export NUM_LOCAL_EXECS=2
mvn clean package -pl tests,integration_tests -am -Dbuildver=331 -Dskip
./integration_tests/run_pyspark_from_build.sh -k 'cast' -s

Unit tests will need extra options for the JVM.
I simply duplicated an internal jdk17 build pipeline from an existing one, and it threw an error. It looks like we may also need to add JVM options to allow access to internal packages for our build in a jdk17 environment. Please help update the instructions if we require building the plugin with jdk17, thanks!
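The thread does not show the exact failure, but one common way to experiment with exposing JDK internal packages to the Maven build JVM is through MAVEN_OPTS. A minimal sketch follows, assuming the failure comes from JDK 17 module access checks; the specific --add-opens entries are illustrative placeholders, not values confirmed in this issue, and would need to match the modules named in the actual error:

# Sketch only: replace the --add-opens entries below with the modules/packages
# reported by the jdk17 build failure; these are assumptions, not from this issue.
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64
export MAVEN_OPTS="--add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/java.util=ALL-UNNAMED"
mvn clean package -Dbuildver=331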
This looks like a manifestation of the bug scala/bug#11814 (comment). However, we don't really have a requirement yet to build for a lower-version JDK using a higher-version JDK. The same command works when source, target, and release are all set to 17:

JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64 mvn -B clean package -Dbuildver=330 -Dmaven.compiler.release=17 -Dmaven.compiler.source=17 -Dmaven.compiler.target=17 -Dscala.plugin.version=4.6.1 -DallowConventionalDistJar=true
Using the new Maven compiler release (17) passed the compile phase but failed the unit tests; we may still need to add the extra JVM args for unit-test runs, see https://github.com/apache/spark/blob/master/pom.xml#L308-L325. Sorry, I unassigned myself for now. I do not have time to work on the new instructions; it would be nice if a developer could help make sure all commands work and update the docs first. Then I will set up a pipeline to continually verify the recommended way to build and test with the new Java runtime, thanks!
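For reference, the linked Spark pom.xml passes a set of --add-opens flags to its test JVMs (the extraJavaTestArgs property). A hedged sketch of forwarding a similar set to the test runner via the argLine user property, assuming the plugin's surefire/scalatest configuration honors argLine, might look like this; the flag list only mirrors the style of Spark's extraJavaTestArgs and should be trimmed or extended to match the actual test failures:

# Sketch, not a verified command for this repo; flags are modeled on Spark's
# extraJavaTestArgs and are assumptions until confirmed against the real failures.
JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64 \
mvn test -pl tests -am -Dbuildver=331 \
  -DargLine="-XX:+IgnoreUnrecognizedVMOptions \
    --add-opens=java.base/java.lang=ALL-UNNAMED \
    --add-opens=java.base/java.util=ALL-UNNAMED \
    --add-opens=java.base/sun.nio.ch=ALL-UNNAMED"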
I will enable the regular run of the JDK17 pipeline after #8234.
Merged the change and enabled the regular nightly CI for the jdk17 build and test (UT+IT).
Is your feature request related to a problem? Please describe.
Spark 3.3 is adding support for jdk17 - https://issues.apache.org/jira/browse/SPARK-33772
Filing this just to track whether we want to support the same versions.
Tasks