Support ANSI intervals to/from Parquet #4810

Merged
Update Comments
Chong Gao committed Feb 23, 2022
commit ed5c56e526fb6d6587f8f0f85adcbfe054752fcc
@@ -74,8 +74,6 @@ case class GpuTimeAdd(start: Expression,
}
calendarI.days * microSecondsInOneDay + calendarI.microseconds
case _: DayTimeIntervalType =>
// Scalar does not support 'DayTimeIntervalType' now, so use
// the Scala value instead.
intervalS.getValue.asInstanceOf[Long]
case _ =>
throw new UnsupportedOperationException(
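The hunk above drops a stale comment: with `DayTimeIntervalType`, the scalar's value is already a single `Long` of microseconds, whereas a `CalendarInterval` carries days and microseconds separately and must be combined. A minimal sketch of that reduction, assuming the `microSecondsInOneDay` constant from the diff; `IntervalMicros` and its method names are hypothetical, not the spark-rapids API:

```scala
// Hedged sketch: both interval representations reduce to a microsecond count.
object IntervalMicros {
  // Microseconds in one day, mirroring microSecondsInOneDay in the diff.
  val microSecondsInOneDay: Long = 24L * 60 * 60 * 1000 * 1000

  // CalendarInterval keeps days and microseconds in separate fields,
  // so they are combined as in the diff: days * microsPerDay + micros.
  def fromCalendar(days: Int, microseconds: Long): Long =
    days * microSecondsInOneDay + microseconds

  // DayTimeIntervalType is already stored as a single Long of microseconds,
  // so the scalar's value can be used directly.
  def fromDayTime(micros: Long): Long = micros
}
```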
@@ -458,7 +458,8 @@ public void releaseReferences() {

private static DType toRapidsOrNull(DataType type) {
DType ret = toRapidsOrNullDefaultImpl(type);
// Check types that shim supporting, e.g.: Spark 3.2.0 supports AnsiIntervalType
// Check the types that the shim supports,
// e.g. Spark 3.3.0 begins supporting AnsiIntervalType to/from Parquet
return (ret != null) ? ret : GpuTypeShims.toRapidsOrNull(type);
}
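`toRapidsOrNull` above is a two-step lookup: try the default type mapping first, and only if it yields nothing, defer to the version-specific shim (which, per the updated comment, is where a Spark 3.3.0 shim would handle `AnsiIntervalType`). A minimal Scala sketch of that fallback pattern, assuming simplified stand-ins; `DefaultMapping`, `TypeShims`, and the string type keys are hypothetical, not the real spark-rapids or cudf APIs:

```scala
// Hedged sketch of the default-then-shim lookup in toRapidsOrNull.
sealed trait DType
case object Int64 extends DType

// Stand-in for toRapidsOrNullDefaultImpl: the version-independent mapping.
object DefaultMapping {
  def toRapidsOrNull(t: String): Option[DType] = t match {
    case "long" => Some(Int64)
    case _      => None // unknown here; let the shim decide
  }
}

// Stand-in for GpuTypeShims: a Spark 3.3.0 shim would map
// AnsiIntervalType (day-time intervals are Longs of microseconds).
object TypeShims {
  def toRapidsOrNull(t: String): Option[DType] = t match {
    case "daytimeinterval" => Some(Int64)
    case _                 => None
  }
}

// Mirrors: return (ret != null) ? ret : GpuTypeShims.toRapidsOrNull(type)
def toRapids(t: String): Option[DType] =
  DefaultMapping.toRapidsOrNull(t).orElse(TypeShims.toRapidsOrNull(t))
```

The design keeps version-dependent type support out of the common code path: the shared mapping never needs to know which Spark version is running, and each shim only adds the types its Spark version supports.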

2 changes: 1 addition & 1 deletion tests/pom.xml
@@ -265,7 +265,7 @@
<goals><goal>add-test-source</goal></goals>
<configuration>
<sources>
<!-- Add cases that can't be compiled before Spark 330 -->
<!-- Some test cases that can't be compiled before Spark 330 -->
<source>${project.basedir}/src/test/330+/scala</source>
</sources>
</configuration>