[FEA] Spark 3.2.0 reads Parquet unsigned int64(UINT64) as Decimal(20,0) but CUDF does not support it #3475
Labels: cudf_dependency (depends on a new feature in cudf), feature request, P0 (must have for release), Spark 3.2+
Is your feature request related to a problem? Please describe.
This is derived from issue #2741.
Spark 3.2.0 reads Parquet unsigned int64 (UINT64) columns as Decimal(20,0). Currently the largest decimal type available is a 64-bit decimal, which holds at most 18 digits of precision, so Decimal(20,0) cannot be represented. cuDF does not support 128-bit decimals yet, and the plugin should be updated once cuDF adds Decimal128 support.
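To see why Decimal(20,0) is required, note that the largest unsigned 64-bit value has 20 decimal digits, which exceeds the 18-digit capacity of a 64-bit decimal. A quick arithmetic sketch (plain Python, not plugin code):

```python
# The maximum unsigned 64-bit integer value.
max_uint64 = 2**64 - 1  # 18446744073709551615

# It has 20 decimal digits, which is why Spark maps UINT64 to Decimal(20,0).
print(len(str(max_uint64)))  # 20

# A 64-bit decimal holds at most 18 digits, so it cannot represent max_uint64.
max_decimal64 = 10**18 - 1
print(max_uint64 > max_decimal64)  # True
```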
Describe the solution you'd like
Update `evolveSchemaIfNeededAndClose` in `GpuParquetScan.scala` and update the type-check logic accordingly.
Describe alternatives you've considered
Currently, just fall back to the CPU for these reads.