Describe the bug
All integration tests fail with: Caused by: java.lang.ClassNotFoundException: com.nvidia.spark.rapids.tests.datasourcev2.parquet.ArrowColumnarDataSourceV2.DefaultSource
[2021-09-22T00:32:59.458Z] if is_error(answer)[0]:
[2021-09-22T00:32:59.458Z] if len(answer) > 1:
[2021-09-22T00:32:59.458Z] type = answer[1]
[2021-09-22T00:32:59.458Z] value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
[2021-09-22T00:32:59.458Z] if answer[1] == REFERENCE_TYPE:
[2021-09-22T00:32:59.458Z] > raise Py4JJavaError(
[2021-09-22T00:32:59.458Z] "An error occurred while calling {0}{1}{2}.\n".
[2021-09-22T00:32:59.458Z] format(target_id, ".", name), value)
[2021-09-22T00:32:59.458Z] E py4j.protocol.Py4JJavaError: An error occurred while calling o361.load.
[2021-09-22T00:32:59.458Z] E : java.lang.ClassNotFoundException: Failed to find data source: com.nvidia.spark.rapids.tests.datasourcev2.parquet.ArrowColumnarDataSourceV2. Please find packages at http://spark.apache.org/third-party-projects.html
[2021-09-22T00:32:59.458Z] E at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:679)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:733)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:248)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:221)
[2021-09-22T00:32:59.459Z] E at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2021-09-22T00:32:59.459Z] E at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[2021-09-22T00:32:59.459Z] E at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2021-09-22T00:32:59.459Z] E at java.lang.reflect.Method.invoke(Method.java:498)
[2021-09-22T00:32:59.459Z] E at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
[2021-09-22T00:32:59.459Z] E at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2021-09-22T00:32:59.459Z] E at py4j.Gateway.invoke(Gateway.java:282)
[2021-09-22T00:32:59.459Z] E at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
[2021-09-22T00:32:59.459Z] E at py4j.commands.CallCommand.execute(CallCommand.java:79)
[2021-09-22T00:32:59.459Z] E at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2021-09-22T00:32:59.459Z] E at java.lang.Thread.run(Thread.java:748)
[2021-09-22T00:32:59.459Z] E Caused by: java.lang.ClassNotFoundException: com.nvidia.spark.rapids.tests.datasourcev2.parquet.ArrowColumnarDataSourceV2.DefaultSource
[2021-09-22T00:32:59.459Z] E at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[2021-09-22T00:32:59.459Z] E at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
[2021-09-22T00:32:59.459Z] E at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$5(DataSource.scala:653)
[2021-09-22T00:32:59.459Z] E at scala.util.Try$.apply(Try.scala:213)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$4(DataSource.scala:653)
[2021-09-22T00:32:59.459Z] E at scala.util.Failure.orElse(Try.scala:224)
[2021-09-22T00:32:59.459Z] E at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:653)
[2021-09-22T00:32:59.459Z] E ... 14 more
[2021-09-22T00:32:59.459Z]
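For context, the failing o361.load call is DataFrameReader.load on the test data source. A minimal PySpark sketch of what the tests are doing at that point (the format string is taken from the stack trace above; the rest is assumed):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Spark resolves the format name to a class and also probes
# "<name>.DefaultSource", which is why that variant appears in the
# Caused by. The load only succeeds if the integration-tests jar that
# contains this class is on the driver/executor classpath.
df = (spark.read
      .format("com.nvidia.spark.rapids.tests.datasourcev2.parquet.ArrowColumnarDataSourceV2")
      .load())
```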
I bet these test runs are not picking up the shim-specific version of the integration tests jar. I seem to remember thinking the scripts would still work, but we put in a change for the run script to look specifically for the jar with the classifier version.
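If that's the cause, a rough illustration of the dependency (not the actual run-script change; the jar path and classifier below are placeholders, not taken from this issue) is that the session has to be started with the shim-specific tests jar on its classpath:

```python
from pyspark.sql import SparkSession

# Placeholder path: the real jar name and classifier come from the run
# script and the Spark shim under test, not from this issue.
tests_jar = "/path/to/integration-tests-<spark-shim-classifier>.jar"

spark = (SparkSession.builder
         # spark.jars puts the jar on both driver and executor classpaths,
         # so DataSource lookup can find ArrowColumnarDataSourceV2.
         .config("spark.jars", tests_jar)
         .getOrCreate())
```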
failure list,