
[BUG] AQE does not work with Spark 3.2 due to unrecognized GPU partitioning #3384

Closed
andygrove opened this issue Sep 3, 2021 · 2 comments
Labels: bug (Something isn't working), Spark 3.2+

@andygrove (Contributor):

Describe the bug

The following CostBasedOptimizerSuite failure shows that Spark 3.2 does not recognize the plugin's GPU partitioning. This is probably not related to CBO itself; it likely surfaces in this suite because the test does not disable AQE, while most of our tests do.

```
- Avoid transition to GPU for trivial projection after CPU SMJ *** FAILED ***
  java.lang.IllegalStateException: Unexpected partitioning for coalesced shuffle read: gpuhashpartitioning(strings#7, 200)
  at org.apache.spark.sql.execution.adaptive.AQEShuffleReadExec.outputPartitioning$lzycompute(AQEShuffleReadExec.scala:86)
  at org.apache.spark.sql.execution.adaptive.AQEShuffleReadExec.outputPartitioning(AQEShuffleReadExec.scala:55)
  at org.apache.spark.sql.execution.exchange.ValidateRequirements$.$anonfun$validateInternal$4(ValidateRequirements.scala:61)
  at org.apache.spark.sql.execution.exchange.ValidateRequirements$.$anonfun$validateInternal$4$adapted(ValidateRequirements.scala:59)
```
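For context, the error comes from `AQEShuffleReadExec.outputPartitioning`, which pattern-matches on the concrete partitioning classes Spark core knows about, so a plugin-provided type like `GpuHashPartitioning` falls through to the error branch. A minimal stand-alone sketch of that failure mode (simplified stand-in types, not the actual Spark source):

```scala
// Simplified model of Spark's partitioning hierarchy; the real classes
// live in org.apache.spark.sql.catalyst.plans.physical.
sealed trait Partitioning
case class HashPartitioning(expr: String, numPartitions: Int) extends Partitioning
// Stand-in for the plugin's GPU-aware partitioning, which Spark core
// does not know about.
case class GpuHashPartitioning(expr: String, numPartitions: Int) extends Partitioning

// Sketch of the match in AQEShuffleReadExec.outputPartitioning: only
// the built-in CPU partitionings are recognized; anything else throws.
def coalescedOutputPartitioning(p: Partitioning): Partitioning = p match {
  case h: HashPartitioning => h // recognized: partitioning is preserved
  case other =>
    throw new IllegalStateException(
      s"Unexpected partitioning for coalesced shuffle read: $other")
}
```

Passing the stand-in `GpuHashPartitioning` to `coalescedOutputPartitioning` reproduces the shape of the `IllegalStateException` above, while the built-in `HashPartitioning` passes through unchanged.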

Steps/Code to reproduce bug
Run CostBasedOptimizerSuite

Expected behavior
Tests should pass

Environment details (please complete the following information)
N/A

Additional context
N/A

@andygrove added the bug (Something isn't working), ? - Needs Triage (Need team to review and classify), and Spark 3.2+ labels on Sep 3, 2021
@andygrove changed the title from "[BUG] Cost-based optimizer not compatible with Spark 3.2" to "[BUG] AQE does not work with Spark 3.2 due to unrecognized GPU partitioning" on Sep 3, 2021
@andygrove (Contributor, Author):
@Salonijain27 removed the ? - Needs Triage (Need team to review and classify) label on Sep 7, 2021
@andygrove added this to the Aug 30 - Sept 10 milestone on Sep 8, 2021
@andygrove (Contributor, Author):

This is now resolved.
