
Remove aggregator dependency before deploying dist artifact #4265

Merged

Conversation

@gerashegalov (Collaborator) commented Dec 2, 2021:

  • remove the aggregator dependency before publishing to the Maven repo
  • package the reduced pom with the jar under META-INF/maven
  • update the deploy script for the new location of the reduced pom
  • the inconsistency of the pom installed into the local .m2 remains, due to an install-file bug, until we refactor install as a separate step; this inconsistency is the same as in 21.10
  • fix a missing space in version-def.sh
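As a quick illustration of the first bullet, the post-condition is simply that the deployed pom no longer references the aggregator. A minimal self-contained sketch of that check, using a synthetic stand-in pom (not the real dist pom, which the build produces):

```shell
# Synthetic stand-in for the dependency-reduced pom packaged under META-INF/maven;
# the real pom is generated by the build, this one only illustrates the check.
cat > reduced-pom.xml <<'EOF'
<project>
  <groupId>com.nvidia</groupId>
  <artifactId>rapids-4-spark_2.12</artifactId>
  <dependencies>
    <!-- the aggregator dependency has been stripped before deploy -->
  </dependencies>
</project>
EOF

# The deployed pom should not mention the aggregator at all.
if grep -q 'rapids-4-spark-aggregator' reduced-pom.xml; then
  echo "aggregator dependency still present"
else
  echo "aggregator dependency removed"   # prints this for the synthetic pom above
fi
```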

Fixes #4253

Signed-off-by: Gera Shegalov <gera@apache.org>
@gerashegalov gerashegalov self-assigned this Dec 2, 2021
@gerashegalov gerashegalov added the build Related to CI / CD or cleanly building label Dec 2, 2021
@gerashegalov gerashegalov added this to the Nov 30 - Dec 10 milestone Dec 2, 2021
@gerashegalov (Collaborator, Author) commented:

build

@@ -23,7 +23,7 @@ nvidia-smi

 ARTF_ROOT="$WORKSPACE/jars"
 MVN_GET_CMD="mvn org.apache.maven.plugins:maven-dependency-plugin:2.8:get -B \
-    -Dmaven.repo.local=$WORKSPACE/.m2 \
+    -Dmaven.repo.local=$WORKSPACE/.m2 -Dtransitive=false \
@NvTimLiu (Collaborator) commented Dec 2, 2021:

This fix would be good for our test script jenkins/spark-tests.sh, but there could be a problem on the customers' end: they may add rapids-4-spark_2.12 to their pom.xml, and without adding transitive=false they will get these dependency errors.
Would it be doable to remove the dependency tree [dependency reduce] from rapids-4-spark_2.12, similar to what we did on branch-21.10? https://github.com/NVIDIA/spark-rapids/pull/3930/files#diff-2681b51a40a6a26d3a6f7f839cd810422226193ef273ea02d1b0b93c3a87cfa6R1868
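For context, a consumer in the scenario described above would typically declare only the dist artifact itself in their pom.xml (coordinates as published; the version shown is the release discussed in this thread):

```xml
<dependency>
  <groupId>com.nvidia</groupId>
  <artifactId>rapids-4-spark_2.12</artifactId>
  <version>21.12.0</version>
</dependency>
```

With the unpublished aggregator listed in the dist pom, resolving this single declaration is what surfaced the missing-dependency errors.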

A Member commented:

Agreed, the dependencies as published seem problematic for users who want to pull in the dist jar for building their RAPIDS accelerated UDFs or custom code that accesses ColumnarRdd.

@gerashegalov (Collaborator, Author) commented:

We have recently litigated this on other PRs: transitive provided dependencies are irrelevant for normal use (https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). IMO we should not optimize for the dependency:get mojo, which behaves differently by default.

A Member commented:

But in this case it's just wrong. The dependency states that the jar needs the aggregator jar to be provided, but it contains it instead. What's the point of advertising a bogus dependency? It clearly causes problems for some use cases, and is misleading at best if someone examines the pom.

@gerashegalov (Collaborator, Author) commented Dec 2, 2021:

I disagree: provided is the weakest scope that still conveys that the aggregator dependency tree is required for building the dist jar (which it is, especially given that some branches like spark312db are built remotely). The next weakest dependency scope is runtime, which states that the dependency will be needed only at the execution phase, and using that might be misleading.

What's the point of advertising a bogus dependency?

I disagree with the characterization as bogus, because the aggregator is required for building the dist package. According to the transitive dependency rules: if the application depends on dist with any scope, and dist depends on aggregator with provided scope, then the dependency subtree rooted at aggregator is specifically not advertised.
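The transitive-scope rule cited here can be sketched as a lookup function. This is my paraphrase of the mediation table in the Maven dependency-mechanism guide, not official tooling: a dependency reached through a provided (or test) edge is omitted entirely, which is why the aggregator subtree is not advertised to consumers of dist.

```shell
# Effective scope of C for A, when A depends on B with scope "direct"
# and B depends on C with scope "transitive", per Maven's mediation table.
effective_scope() {
  direct=$1; transitive=$2
  case "$transitive" in
    provided|test) echo omitted ;;        # never inherited transitively
    compile)       echo "$direct" ;;      # takes the direct edge's scope
    runtime)       if [ "$direct" = compile ]; then echo runtime; else echo "$direct"; fi ;;
  esac
}

# app -> dist (compile), dist -> aggregator (provided): aggregator is omitted.
effective_scope compile provided   # prints: omitted
```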

A Member commented:

IMO this violates the spirit of the shading we're doing in two ways.

  1. Most shaded jars do not advertise the dependencies they used during builds when they pull in those artifacts. That's why there's a dependency-reduced pom built by default by the shade plugin. What we're doing here is effectively a form of shading, so I do not understand why we would want to advertise the details of a dependency that is irrelevant once the artifact is published. What's the use-case where having this dependency in the published artifact is desired? Maybe I'm missing that.

  2. I do not think we should reference unpublished dependencies in the poms of our published artifacts, even if they are marked provided. The user does not need to provide it to use the artifact, and the fact that this is a build-time only dependency is not clearly conveyed by the pom as-is.

Bottom line is that this is breaking some workflows as it is now. Sure, we control this particular workflow and can update it as a workaround, but do we really want to explain this all over again to a user who is doing something similar? I think it's a bit difficult to argue that we need to list a dependency in this pom as provided when it's not actually needed, while not publishing a dependency-reduced pom to avoid such problems.

@gerashegalov (Collaborator, Author) commented:

We have tried a manual install-file with a dependency-reduced pom: 372bc13#diff-2681b51a40a6a26d3a6f7f839cd810422226193ef273ea02d1b0b93c3a87cfa6R375

The issue with that is a bug in the install-file mojo: when executed inside a multi-module build, it picks a random pom (based on the build order of the current Maven session) instead of the current one. I question whether it's worth solving a simple problem affecting this edge case, because even a small change is more disruptive (#4043 (comment)) than the problem it addresses.

@gerashegalov (Collaborator, Author) commented:

Updated the PR based on the offline meeting with @jlowe

@pxLi (Collaborator) commented Dec 3, 2021:

Would this be better to target branch-21.12?

Signed-off-by: Gera Shegalov <gera@apache.org>
@gerashegalov gerashegalov changed the base branch from branch-22.02 to branch-21.12 December 3, 2021 03:13
@gerashegalov gerashegalov changed the title Use intransitive dependency:get for artifacts Remove aggregator dependency before deploying dist artifact Dec 3, 2021
@NvTimLiu (Collaborator) commented Dec 3, 2021:

Got a build error with this PR:

19:45:30  [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:3.0.0:run (create-parallel-world) on project rapids-4-spark_2.12: An Ant BuildException has occured: The following error occurred while executing this line:
19:45:30  [ERROR] /home/jenkins/agent/workspace/jenkins-tim-rapids-release-7/dist/maven-antrun/build-parallel-worlds.xml:125: exec returned: 255
19:45:30  [ERROR] around Ant part ...<ant antfile="/home/jenkins/agent/workspace/jenkins-tim-rapids-release-7/dist/maven-antrun/build-parallel-worlds.xml" target="build-parallel-worlds" />... @ 9:156 in /home/jenkins/agent/workspace/jenkins-tim-rapids-release-7/dist/target/antrun/build-main.xml
19:45:30  [ERROR] -> [Help 1]
19:45:30  [ERROR] 
19:45:30  [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
19:45:30  [ERROR] Re-run Maven using the -X switch to enable full debug logging.
19:45:30  [ERROR] 
19:45:30  [ERROR] For more information about the errors and possible solutions, please read the following articles:
19:45:30  [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
19:45:30  [ERROR] 
19:45:30  [ERROR] After correcting the problems, you can resume the build with the command
19:45:30  [ERROR]   mvn <goals> -rf :rapids-4-spark_2.12

@gerashegalov (Collaborator, Author) commented:

build

@gerashegalov (Collaborator, Author) commented Dec 3, 2021:

The error is from a line number preceding the change in this PR (executing binary-dedupe), so it is probably unrelated, but checking.
UPDATE: binary-dedupe flagged:

03:45:30  [INFO]      [exec] com/nvidia/spark/rapids/ExplainPlanImpl.class is not bitwise-identical across shims

spark320 sha1 is different:

tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark301/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark302/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark303/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark311/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark311cdh/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark312/com/nvidia/spark/rapids/ExplainPlanImpl.class
tmp-sha1-class.txt:22aee3f24a0fa81ce3e26acc42641b520cfa1e03 *./parallel-world/spark320/com/nvidia/spark/rapids/ExplainPlanImpl.class

When building locally, the build is correct:

unzip -p aggregator/target/spark320/rapids-4-spark-aggregator_2.12-21.12.0-SNAPSHOT.jar com/nvidia/spark/rapids/ExplainPlanImpl.class | sha1sum
d256e7b9639b757d13e650e96eaee1850ec05f13
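The bitwise-identity check that binary-dedupe performs can be reproduced in isolation. A minimal self-contained sketch over synthetic class files (the directory layout mirrors the parallel-world structure above, but the file contents are made up):

```shell
# Simulate parallel-world shim directories with one class that differs in a single shim.
workdir=$(mktemp -d); cd "$workdir"
mkdir -p parallel-world/spark301 parallel-world/spark312 parallel-world/spark320
printf 'same-bytecode' > parallel-world/spark301/ExplainPlanImpl.class
printf 'same-bytecode' > parallel-world/spark312/ExplainPlanImpl.class
printf 'diff-bytecode' > parallel-world/spark320/ExplainPlanImpl.class  # simulated mismatch

sha1sum parallel-world/*/ExplainPlanImpl.class > tmp-sha1-class.txt
# A class is bitwise-identical across shims iff there is exactly one distinct sha1.
distinct=$(sort -u -k1,1 tmp-sha1-class.txt | wc -l)
if [ "$distinct" -gt 1 ]; then
  echo "ExplainPlanImpl.class is not bitwise-identical across shims"
fi
```

The `sort -u -k1,1` over the sha1 manifest is the same deduplication step shown in the CI inspection below.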

@gerashegalov (Collaborator, Author) commented:

Inspecting the CI workspace points to the Databricks-vs-other-shims split-brain problem.

$ grep ExplainPlanImpl.class tmp-sha1*.txt
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark301/com/nvidia/spark/rapids/ExplainPlanImpl.class
22aee3f24a0fa81ce3e26acc42641b520cfa1e03 *./parallel-world/spark301db/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark302/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark303/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark311/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark311cdh/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark312/com/nvidia/spark/rapids/ExplainPlanImpl.class
22aee3f24a0fa81ce3e26acc42641b520cfa1e03 *./parallel-world/spark312db/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark320/com/nvidia/spark/rapids/ExplainPlanImpl.class

$ grep ExplainPlanImpl.class tmp-sha1*.txt | sort -u -k1,1
22aee3f24a0fa81ce3e26acc42641b520cfa1e03 *./parallel-world/spark301db/com/nvidia/spark/rapids/ExplainPlanImpl.class
d256e7b9639b757d13e650e96eaee1850ec05f13 *./parallel-world/spark301/com/nvidia/spark/rapids/ExplainPlanImpl.class

The jars being combined were built at different commits, c8dd501 and 6fd125d respectively

$ unzip -p deps/rapids-4-spark-aggregator_2.12-21.12.0-spark312db.jar rapids4spark-version-info.properties
version=21.12.0
cudf_version=21.12.0-SNAPSHOT
user=ubuntu
revision=c8dd50132676feeccd7e4908e8f7937b97c62c87
branch=HEAD
date=2021-11-30T11:46:24Z
url=https://github.com/NvTimLiu/spark-rapids.git

vs

$ unzip -p deps/rapids-4-spark-aggregator_2.12-21.12.0-spark320.jar rapids4spark-version-info.properties
version=21.12.0
cudf_version=21.12.0-SNAPSHOT
user=
revision=6fd125d70d204a5402a0853a367d9f4f99ed8a22
branch=HEAD
date=2021-12-03T11:01:38Z
url=https://github.com/NvTimLiu/spark-rapids.git

@NvTimLiu (Collaborator) commented Dec 4, 2021:

Oh, I see, we need to build the DB shims before the spark-rapids release.

Is this diff (c8dd501 vs 6fd125d) the reason for the error build-parallel-worlds.xml:125: exec returned: 255?

Let me update the DB shims to use the same commit id as the other shims.

@gerashegalov (Collaborator, Author) commented:

Oh, I see, we need to build the DB shims before the spark-rapids release.

Is this diff (c8dd501 vs 6fd125d) the reason for the error build-parallel-worlds.xml:125: exec returned: 255?

Yes, it's because of the commit mismatch: I verified that the shas match and the build succeeds once the nightly artifacts are caught up. Somehow we had an inconsistent set of artifacts in the internal Maven repo, which I could reproduce by skipping the real build and just building -pl dist for noSnapshotsWithDatabricks without the dependency modules (-am). Some hours later, the next nightly build finished and the artifacts became consistent again.

I think there is a simple solution that would prevent this from happening entirely: when Databricks is involved, all artifacts (shims) should be built on Databricks nodes. The build pipeline can then fan out into testing stages, where only Databricks-related tests run on Databricks and the rest runs on the non-Databricks fleet. We should not fan out for artifact creation as we do today.

I'll also file a bug to better surface the error message from binary-dedupe.sh to Maven.

@NvTimLiu (Collaborator) commented Dec 5, 2021:

Updated the DB shims to use the same commit id as the nightly ones; build PASS, test PASS.

Build: tim-databricks-release-urm #1 rapids-release #90

Test: REL_it-A30-AQE/19/consoleFull

11:39:26  + mvn org.apache.maven.plugins:maven-dependency-plugin:2.8:get -B -Dmaven.repo.local=/home/jenkins/agent/workspace/REL_it-A30-AQE/.m2 -s jenkins/settings.xml -P mirror-apache-to-urm -Ddest=/home/jenkins/agent/workspace/REL_it-A30-AQE/jars -DremoteRepositories=https://oss.sonatype.org/content/repositories/staging -DgroupId=com.nvidia -DartifactId=rapids-4-spark_2.12 -Dversion=21.12.0
11:39:27  [INFO] Scanning for projects...
11:39:28  [INFO] ------------------------------------------------------------------------
11:39:28  [INFO] Reactor Build Order:
11:39:28  [INFO] 
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Root Project                   [pom]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin                     [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Shuffle Plugin                 [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin Shims               [pom]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.1 Shim    [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Scala UDF Plugin               [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Aggregator                     [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Distribution                   [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark UDF Examples                   [jar]
11:39:28  [INFO] RAPIDS Accelerator for Apache Spark Tests                          [jar]
11:39:28  [INFO] rapids-4-spark-integration-tests_2.12                              [jar]
11:39:28  [INFO] rapids-4-spark-api-validation                                      [jar]
11:39:28  [INFO] 
11:39:28  [INFO] ------------------< com.nvidia:rapids-4-spark-parent >------------------
11:39:28  [INFO] Building RAPIDS Accelerator for Apache Spark Root Project 21.12.0-SNAPSHOT [1/12]
11:39:28  [INFO] --------------------------------[ pom ]---------------------------------
11:39:28  [INFO] 
11:39:28  [INFO] --- maven-dependency-plugin:2.8:get (default-cli) @ rapids-4-spark-parent ---
11:39:29  [INFO] Resolving com.nvidia:rapids-4-spark_2.12:jar:21.12.0 with transitive dependencies
11:39:29  [INFO] Downloading from apache-snapshots-repo: *****/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.pom
11:39:31  [INFO] Downloading from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.pom
11:39:31  [INFO] Downloaded from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.pom (16 kB at 27 kB/s)
11:39:31  [INFO] Downloading from apache-snapshots-repo: *****/com/nvidia/rapids-4-spark-parent/21.12.0/rapids-4-spark-parent-21.12.0.pom
11:39:32  [INFO] Downloading from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark-parent/21.12.0/rapids-4-spark-parent-21.12.0.pom
11:39:32  [INFO] Downloaded from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark-parent/21.12.0/rapids-4-spark-parent-21.12.0.pom (65 kB at 222 kB/s)
11:39:32  [INFO] Downloading from apache.snapshots.https: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloading from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloading from snapshots: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloading from apache.snapshots: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloaded from apache.snapshots.https: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml (1.6 kB at 44 kB/s)
11:39:32  [INFO] Downloaded from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml (1.6 kB at 44 kB/s)
11:39:32  [INFO] Downloaded from apache.snapshots: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml (1.6 kB at 43 kB/s)
11:39:32  [INFO] Downloading from temp: https://oss.sonatype.org/content/repositories/staging/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloaded from snapshots: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml (1.6 kB at 41 kB/s)
11:39:32  [INFO] Downloading from snapshots-repo: https://oss.sonatype.org/content/repositories/snapshots/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml
11:39:32  [INFO] Downloaded from snapshots-repo: https://oss.sonatype.org/content/repositories/snapshots/ai/rapids/cudf/21.12.0-SNAPSHOT/maven-metadata.xml (1.4 kB at 8.7 kB/s)
11:39:32  [INFO] Downloading from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/cudf-21.12.0-20211204.082229-68.pom
11:39:32  [INFO] Downloaded from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/cudf-21.12.0-20211204.082229-68.pom (23 kB at 830 kB/s)
11:39:32  [INFO] Downloading from apache-snapshots-repo: *****/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar
11:39:32  [INFO] Downloading from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/cudf-21.12.0-20211204.082229-68-cuda11.jar
11:39:33  [INFO] Downloading from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar
11:39:35  [INFO] Downloaded from temp: https://oss.sonatype.org/content/repositories/staging/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar (34 MB at 17 MB/s)
11:39:39  [INFO] Downloaded from apache-snapshots-repo: *****/ai/rapids/cudf/21.12.0-SNAPSHOT/cudf-21.12.0-20211204.082229-68-cuda11.jar (329 MB at 56 MB/s)
11:39:39  [WARNING] destination/dest parameter is deprecated: it will disappear in future version.
11:39:39  [INFO] Copying /home/jenkins/agent/workspace/REL_it-A30-AQE/.m2/com/nvidia/rapids-4-spark_2.12/21.12.0/rapids-4-spark_2.12-21.12.0.jar to /home/jenkins/agent/workspace/REL_it-A30-AQE/jars
11:39:39  [WARNING] Notice transitive dependencies won't be copied.
11:39:39  [INFO] ------------------------------------------------------------------------
11:39:39  [INFO] Reactor Summary for RAPIDS Accelerator for Apache Spark Root Project 21.12.0-SNAPSHOT:
11:39:39  [INFO] 
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Root Project ... SUCCESS [ 10.762 s]
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin ..... SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Shuffle Plugin . SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin Shims SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.1 Shim SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Scala UDF Plugin SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Aggregator ..... SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Distribution ... SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark UDF Examples ... SKIPPED
11:39:39  [INFO] RAPIDS Accelerator for Apache Spark Tests .......... SKIPPED
11:39:39  [INFO] rapids-4-spark-integration-tests_2.12 .............. SKIPPED
11:39:39  [INFO] rapids-4-spark-api-validation ...................... SKIPPED
11:39:39  [INFO] ------------------------------------------------------------------------
11:39:39  [INFO] BUILD SUCCESS
11:39:39  [INFO] ------------------------------------------------------------------------
11:39:39  [INFO] Total time:  11.035 s
11:39:39  [INFO] Finished at: 2021-12-05T03:39:38Z
11:39:39  [INFO] ------------------------------------------------------------------------

@NvTimLiu (Collaborator) commented Dec 6, 2021:

@gerashegalov Could you please check whether we're good to merge?

@GaryShen2008 GaryShen2008 merged commit 3e27acc into NVIDIA:branch-21.12 Dec 6, 2021
@GaryShen2008 (Collaborator) commented:

Merge it since test passed.


Successfully merging this pull request may close these issues.

[BUG] Dependencies missing of spark-rapids v21.12.0 release jars