
[BUG] failures parquet_test test:read_round_trip #3195

Closed · tgravescs opened this issue Aug 11, 2021 · 2 comments · Fixed by #3203
Assignees: jlowe
Labels: bug (Something isn't working), P0 (Must have for release)

@tgravescs (Collaborator)

A bunch of failures like this one:

08:01:21 FAILED integration_tests/src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs2-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]

08:01:21 E Caused by: org.apache.spark.SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: writing dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z into Parquet INT96 files can be dangerous, as the files may be read by Spark 2.x or legacy versions of Hive later, which uses a legacy hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian calendar. See more details in SPARK-31404. You can set spark.sql.legacy.parquet.int96RebaseModeInWrite to 'LEGACY' to rebase the datetime values w.r.t. the calendar difference during writing, to get maximum interoperability. Or set spark.sql.legacy.parquet.int96RebaseModeInWrite to 'CORRECTED' to write the datetime values as it is, if you are 100% sure that the written files will only be read by Spark 3.0+ or other systems that use Proleptic Gregorian calendar.
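
For reference, the workaround named in the exception can be exercised in a plain PySpark session. This is a minimal sketch assuming Spark 3.1.x, not the test harness code; the app name and output path are illustrative:

```python
# Minimal sketch of the workaround the exception message suggests.
# Assumes a plain PySpark 3.1.x session; output path is illustrative.
import datetime
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("int96-rebase-sketch")
         # INT96 is the physical timestamp type named in the exception
         # (it is also Spark's default outputTimestampType).
         .config("spark.sql.parquet.outputTimestampType", "INT96")
         # 'CORRECTED' writes pre-1900 timestamps as-is for Proleptic
         # Gregorian readers; 'LEGACY' rebases them for Spark 2.x / Hive.
         .config("spark.sql.legacy.parquet.int96RebaseModeInWrite", "CORRECTED")
         .getOrCreate())

# A timestamp before 1900-01-01T00:00:00Z is what trips the exception
# when the rebase mode is left at its default ('EXCEPTION').
df = spark.createDataFrame(
    [(datetime.datetime(1899, 12, 31, 23, 59, 59),)], ["ts"])
df.write.mode("overwrite").parquet("/tmp/int96_rebase_demo")
```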

@tgravescs tgravescs added bug Something isn't working ? - Needs Triage Need team to review and classify labels Aug 11, 2021
@jlowe jlowe added the P0 Must have for release label Aug 11, 2021
@jlowe jlowe self-assigned this Aug 11, 2021
@pxLi pxLi changed the title [BUG] Dataproc build failures parquet_test test:read_round_trip [BUG] build failures parquet_test test:read_round_trip Aug 12, 2021
@pxLi (Collaborator) commented Aug 12, 2021

This failed in the nightly tests with Spark standalone 3.1.1 and 3.1.2:

Caused by: org.apache.spark.SparkException: Task failed while writing rows.
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:296)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$15(FileFormatWriter.scala:210)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.scheduler.Task.run(Task.scala:131)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
[2021-08-12T07:04:13.862Z] E                   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[2021-08-12T07:04:13.862Z] E                   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[2021-08-12T07:04:13.862Z] E                   	... 1 more
[2021-08-12T07:04:13.862Z] E                   Caused by: org.apache.spark.SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: writing dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z into Parquet INT96 files can be dangerous, as the files may be read by Spark 2.x or legacy versions of Hive later, which uses a legacy hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian calendar. See more details in SPARK-31404. You can set spark.sql.legacy.parquet.int96RebaseModeInWrite to 'LEGACY' to rebase the datetime values w.r.t. the calendar difference during writing, to get maximum interoperability. Or set spark.sql.legacy.parquet.int96RebaseModeInWrite to 'CORRECTED' to write the datetime values as it is, if you are 100% sure that the written files will only be read by Spark 3.0+ or other systems that use Proleptic Gregorian calendar.
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.DataSourceUtils$.newRebaseExceptionInWrite(DataSourceUtils.scala:165)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.DataSourceUtils$.$anonfun$creteTimestampRebaseFuncInWrite$1(DataSourceUtils.scala:209)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeWriter$10(ParquetWriteSupport.scala:208)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeWriter$10$adapted(ParquetWriteSupport.scala:207)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$6(ParquetWriteSupport.scala:450)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeField(ParquetWriteSupport.scala:476)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$4(ParquetWriteSupport.scala:450)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeGroup(ParquetWriteSupport.scala:470)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$3(ParquetWriteSupport.scala:442)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeField(ParquetWriteSupport.scala:476)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$2(ParquetWriteSupport.scala:439)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeGroup(ParquetWriteSupport.scala:470)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$1(ParquetWriteSupport.scala:438)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$makeMapWriter$1$adapted(ParquetWriteSupport.scala:431)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$writeFields$1(ParquetWriteSupport.scala:158)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeField(ParquetWriteSupport.scala:476)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.writeFields(ParquetWriteSupport.scala:158)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.$anonfun$write$1(ParquetWriteSupport.scala:148)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.consumeMessage(ParquetWriteSupport.scala:464)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.write(ParquetWriteSupport.scala:148)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.write(ParquetWriteSupport.scala:54)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:128)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:182)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:44)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.write(ParquetOutputWriter.scala:39)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.write(FileFormatDataWriter.scala:140)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:278)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1473)
[2021-08-12T07:04:13.862Z] E                   	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:286)
[2021-08-12T07:04:13.862Z] E                   	... 9 more
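
Note the trace shows the exception is raised in the CPU-side write that produces the test input (FileFormatWriter -> ParquetWriteSupport, inside the makeMapWriter frames, i.e. while writing the Map(Timestamp...) columns), so any workaround has to reach the session doing the write. A pytest-flavored sketch under that assumption; the `spark` session fixture, test name, and data are generic stand-ins, not the project's actual harness:

```python
# Sketch: the rebase conf must be set on the session that WRITES the data.
# `spark` is an assumed PySpark session fixture; `tmp_path` is pytest's
# built-in temp-dir fixture -- both stand-ins, not the real harness.
import datetime

def test_pre_1900_int96_round_trip(spark, tmp_path):
    path = str(tmp_path / "round_trip.parquet")
    # Without this, Spark 3.1 raises SparkUpgradeException for pre-1900
    # timestamps written as INT96 (the default rebase mode is 'EXCEPTION').
    spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite",
                   "CORRECTED")
    df = spark.createDataFrame(
        [(datetime.datetime(1899, 12, 31, 23, 59, 59),)], ["ts"])
    df.write.parquet(path)
    assert spark.read.parquet(path).count() == 1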

@pxLi pxLi changed the title [BUG] build failures parquet_test test:read_round_trip [BUG] failures parquet_test test:read_round_trip Aug 12, 2021
@pxLi (Collaborator) commented Aug 12, 2021

All 12 test failures:

FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs0-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs0-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs1-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs1-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs2-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[-reader_confs2-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs0-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.866Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs0-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.867Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs1-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.867Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs1-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.867Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs2-read_parquet_df-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
[2021-08-12T07:04:13.867Z] FAILED src/main/python/parquet_test.py::test_read_round_trip[parquet-reader_confs2-read_parquet_sql-[Byte, Short, Integer, Long, Float, Double, String, Boolean, Date, Timestamp, Array(Byte), Array(Long), Array(String), Array(Date), Array(Timestamp), Array(Decimal(18,0)), Array(Array(Byte)), Struct(['child0', Array(Byte)],['child1', Byte],['child2', Float],['child3', Decimal(18,0)]), Array(Struct(['child0', String],['child1', Double],['child2', Integer])), Map(Boolean(not_null),Boolean), Map(Byte(not_null),Byte), Map(Short(not_null),Short), Map(Integer(not_null),Integer), Map(Long(not_null),Long), Map(Float(not_null),Float), Map(Double(not_null),Double), Map(Date(not_null),Date), Map(Timestamp(not_null),Timestamp), Map(String(not_null),String), Map(String(not_null),Array(String)), Map(RepeatSeq(not_null)(Integer(not_null)),Long), Map(String(not_null),Map(String(not_null),String)), Decimal(18,0), Decimal(7,3), Decimal(10,10), Decimal(9,0), Decimal(18,15)]]
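
The 12 failures are the full cross product of the test's parametrization visible in the IDs: two v1 source-list settings ('' vs. 'parquet') × three reader confs × two read paths (DataFrame vs. SQL). A schematic of how pytest produces IDs of this shape; the parameter names are illustrative guesses from the ID shape, and the bodies and conf values are placeholders (the real test lives in integration_tests/src/main/python/parquet_test.py):

```python
# Illustrative parametrization behind the 12 IDs above:
# 2 datasource settings x 3 reader confs x 2 read paths = 12 cases.
import pytest

@pytest.mark.parametrize('v1_enabled_list', ['', 'parquet'])
@pytest.mark.parametrize('reader_confs', [{}, {}, {}],
                         ids=['reader_confs0', 'reader_confs1', 'reader_confs2'])
@pytest.mark.parametrize('read_func', ['read_parquet_df', 'read_parquet_sql'])
def test_read_round_trip(v1_enabled_list, reader_confs, read_func):
    ...  # write the generated data, read it back, compare CPU vs. GPU
```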

@sameerz sameerz removed the ? - Needs Triage Need team to review and classify label Aug 12, 2021
@sameerz sameerz added this to the Aug 2 - Aug 13 milestone Aug 12, 2021