
Fix test_compress_write_round_trip failure on Spark 3.3 #4847

Merged · 1 commit · Feb 23, 2022

Conversation

@jlowe (Member) commented Feb 23, 2022

Fixes #4841. Spark's ParquetWriteSupport does not handle Spark configs that are missing from the Hadoop configuration, so we need to port the new config that was added in SPARK-38094. Note that this PR does not add full support for the new Parquet field ID configs that were added for that Spark change, and that work is tracked in #4846.
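The porting pattern described above — copying a session-level Spark SQL config into the Hadoop configuration so that ParquetWriteSupport never encounters a missing entry — can be sketched as follows. This is a minimal sketch, not the actual change in this PR; the config key name and the helper are assumptions (see SPARK-38094 for the real config):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.sql.internal.SQLConf

// Hypothetical sketch: port a Spark SQL config into the Hadoop conf
// before ParquetWriteSupport reads it. The key name is an assumption.
def portFieldIdConf(hadoopConf: Configuration, sqlConf: SQLConf): Unit = {
  val key = "spark.sql.parquet.fieldId.write.enabled" // assumed key name
  if (hadoopConf.get(key) == null) {
    // Fall back to the session value (or a default) so code that does
    // hadoopConf.get(key).toBoolean never sees a null entry.
    hadoopConf.set(key, sqlConf.getConfString(key, "false"))
  }
}
```

The guard on `hadoopConf.get(key) == null` preserves any value a caller already set, while still guaranteeing the key is present by the time the write path consults it.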

Signed-off-by: Jason Lowe <jlowe@nvidia.com>
@jlowe jlowe added this to the Feb 14 - Feb 25 milestone Feb 23, 2022
@jlowe jlowe self-assigned this Feb 23, 2022
@jlowe (Member, Author) commented Feb 23, 2022

build

@jlowe jlowe merged commit b3f1727 into NVIDIA:branch-22.04 Feb 23, 2022
@jlowe jlowe deleted the fix-parquet-test-330 branch February 23, 2022 19:30
Development

Successfully merging this pull request may close these issues:

[BUG] test_compress_write_round_trip failed on Spark 3.3
2 participants