Describe the bug
"Sequence" function fails with "java.lang.UnsupportedOperationException: Not supported on UnsafeArrayData".
Full stacktrace:
java.lang.UnsupportedOperationException: Not supported on UnsafeArrayData.
at org.apache.spark.sql.catalyst.expressions.UnsafeArrayData.array(UnsafeArrayData.java:108)
at com.nvidia.spark.rapids.GpuScalar$.from(literals.scala:300)
at com.nvidia.spark.rapids.GpuScalar.getBase(literals.scala:442)
at com.nvidia.spark.rapids.GpuColumnVector.from(GpuColumnVector.java:856)
at com.nvidia.spark.rapids.GpuExpressionsUtils$.$anonfun$resolveColumnVector$1(GpuExpressions.scala:74)
at com.nvidia.spark.rapids.Arm.withResourceIfAllowed(Arm.scala:73)
at com.nvidia.spark.rapids.Arm.withResourceIfAllowed$(Arm.scala:71)
at com.nvidia.spark.rapids.GpuExpressionsUtils$.withResourceIfAllowed(GpuExpressions.scala:29)
at com.nvidia.spark.rapids.GpuExpressionsUtils$.resolveColumnVector(GpuExpressions.scala:72)
at com.nvidia.spark.rapids.GpuExpressionsUtils$.columnarEvalToColumn(GpuExpressions.scala:93)
at com.nvidia.spark.rapids.GpuProjectExec$.projectSingle(basicPhysicalOperators.scala:70)
at com.nvidia.spark.rapids.GpuProjectExec$.$anonfun$project$1(basicPhysicalOperators.scala:77)
at com.nvidia.spark.rapids.RapidsPluginImplicits$MapsSafely.$anonfun$safeMap$1(implicits.scala:162)
at com.nvidia.spark.rapids.RapidsPluginImplicits$MapsSafely.$anonfun$safeMap$1$adapted(implicits.scala:159)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.nvidia.spark.rapids.RapidsPluginImplicits$MapsSafely.safeMap(implicits.scala:159)
at com.nvidia.spark.rapids.RapidsPluginImplicits$AutoCloseableProducingSeq.safeMap(implicits.scala:194)
at com.nvidia.spark.rapids.GpuProjectExec$.project(basicPhysicalOperators.scala:77)
at com.nvidia.spark.rapids.GpuProjectExec$.projectAndClose(basicPhysicalOperators.scala:41)
at com.nvidia.spark.rapids.GpuProjectExec.$anonfun$doExecuteColumnar$1(basicPhysicalOperators.scala:119)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at com.nvidia.spark.rapids.ColumnarToRowIterator.$anonfun$fetchNextBatch$2(GpuColumnarToRowExec.scala:224)
at com.nvidia.spark.rapids.Arm.withResource(Arm.scala:28)
at com.nvidia.spark.rapids.Arm.withResource$(Arm.scala:26)
at com.nvidia.spark.rapids.ColumnarToRowIterator.withResource(GpuColumnarToRowExec.scala:177)
at com.nvidia.spark.rapids.ColumnarToRowIterator.fetchNextBatch(GpuColumnarToRowExec.scala:221)
at com.nvidia.spark.rapids.ColumnarToRowIterator.loadNextBatch(GpuColumnarToRowExec.scala:198)
at com.nvidia.spark.rapids.ColumnarToRowIterator.hasNext(GpuColumnarToRowExec.scala:238)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:898)
at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:898)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Steps/Code to reproduce bug
spark.sql("SELECT sequence(1, 5)").collect()
Expected behavior
The query should not fail.
It should fall back to the CPU if the sequence function is not supported on the GPU.
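Until that fallback is in place, a possible session-level workaround, sketched under the assumption (based on the stack trace) that the failure happens while converting the folded array literal, so disabling individual expressions may not help:

```scala
// Workaround sketch, not from the original report: spark.rapids.sql.enabled
// is the plugin's documented master switch for SQL acceleration; turning it
// off keeps the whole query on the CPU.
spark.conf.set("spark.rapids.sql.enabled", "false")
spark.sql("SELECT sequence(1, 5)").collect()  // expected: a single row with [1, 2, 3, 4, 5]
```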
Environment details (please complete the following information)
Spark 3.1.1 standalone cluster
21.10 snapshot jar