

[BUG] buildall uses a stale aggregator dependency during test compilation #4053

Closed
gerashegalov opened this issue Nov 8, 2021 · 0 comments · Fixed by #4054
Labels: bug (Something isn't working), build (Related to CI / CD or cleanly building)

gerashegalov commented Nov 8, 2021

Originally reported by @revans2
Describe the bug
The #4029 fix, which relies exclusively on the package phase to avoid concurrent installs to the local Maven repo, causes the test compile in the second, single-threaded phase of buildall (resuming from dist, https://github.com/NVIDIA/spark-rapids/blob/branch-21.12/build/buildall#L184) to pull a non-existent or stale aggregator artifact.

This is because the Maven invocation resuming from dist can no longer see the dependencies built in the first phase unless they are installed. The dist build itself was not affected by this, because we bypass Maven when picking up aggregators for the final parallel-worlds assembly: https://github.com/NVIDIA/spark-rapids/blob/branch-21.12/dist/maven-antrun/build-parallel-worlds.xml#L73.
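The package-vs-install distinction above can be sketched as follows. This is a hypothetical command sequence, not the actual buildall logic; the module names, the `-pl`/`-am` selection, and the `buildver` property are assumptions for illustration:

```shell
# Hypothetical sketch: why the resumed build cannot see phase-one artifacts.
command -v mvn >/dev/null 2>&1 || exit 0  # skip gracefully if Maven is absent

# Phase 1: `package` leaves the aggregator jar in its module's target/
# directory; nothing is written to the local repo (~/.m2/repository).
mvn package -pl aggregator -am -Dbuildver=301

# Phase 2: a fresh invocation resuming from dist no longer has the
# aggregator module in its reactor, so Maven resolves the aggregator
# from the local repo and finds nothing (or a stale jar from an
# earlier build).
mvn package -pl dist -Dbuildver=301

# Using `install` in phase 1 publishes the jar to the local repo,
# making it visible to the separate phase-2 invocation.
mvn install -pl aggregator -am -Dbuildver=301
```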

Tests should not rely on the aggregator (#3932), but we need a workaround in the meantime.

Steps/Code to reproduce bug
Remove your local ~/.m2 cache, or point maven.repo.local to an empty dir, then run:

./build/buildall

The build fails with:

Failed to execute goal on project rapids-4-spark_2.12: Could not resolve dependencies for project com.nvidia:rapids-4-spark_2.12:jar:21.12.0-SNAPSHOT: Could not find artifact com.nvidia:rapids-4-spark-aggregator_2.12:jar:spark301:21.12.0-SNAPSHOT in snapshots-repo (https://oss.sonatype.org/content/repositories/snapshots)

Expected behavior
buildall should build correctly. We can have a single shim build thread use install, without any risk of concurrent installs to the local Maven repo.
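One possible shape for that workaround (a hypothetical sketch, not the actual buildall change in #4054): build every shim with `package`, but have the shim for the minimum Spark version (typically 301, per the fix commit) use `install` so its aggregator artifact lands in the local repo. The shim list and Maven flags below are illustrative assumptions:

```shell
# Hypothetical sketch of the workaround: exactly one thread runs
# `install`, so there are no concurrent writes to ~/.m2/repository.
command -v mvn >/dev/null 2>&1 || exit 0   # skip gracefully if Maven is absent

for buildver in 302 303 304 311 312; do    # illustrative shim list
  mvn -q package -Dbuildver="$buildver" &  # packaged shims run in parallel
done
mvn -q install -Dbuildver=301              # the single installing thread
wait                                       # join the background builds
```

Because only the 301 build touches the local repo, the concurrency concern that motivated #4029 does not return, while the resumed dist invocation can resolve the installed aggregator.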

Environment details (please complete the following information)
local dev

Additional context
#3932

@gerashegalov gerashegalov added bug Something isn't working ? - Needs Triage Need team to review and classify labels Nov 8, 2021
@gerashegalov gerashegalov self-assigned this Nov 8, 2021
@gerashegalov gerashegalov added build Related to CI / CD or cleanly building and removed ? - Needs Triage Need team to review and classify labels Nov 8, 2021
@gerashegalov gerashegalov added this to the Nov 1 - Nov 12 milestone Nov 8, 2021
gerashegalov added a commit that referenced this issue Nov 8, 2021
Make sure that the minimum spark version classifier of the aggregator (typically 301) is `install`ed  

Closes #4053 

Signed-off-by: Gera Shegalov <gera@apache.org>