
Use gradle cache in github action #33


Re-run triggered July 9, 2024 01:23
Status: Failure
Total duration: 2m 30s

check.yml

on: push
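The change under test is Gradle dependency caching in the workflow. A minimal sketch of what such a check.yml might look like, assuming the burrunan/gradle-cache-action flagged later in the annotations is the caching action in use (the job name, JDK distribution/version, and action inputs are assumptions, not the repository's actual file):

```yaml
name: check
on: push

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '11'   # assumed; Spark-based apps commonly target 11
      # Caches Gradle dependencies and wrapper between runs, then builds
      - name: Build with Gradle cache
        uses: burrunan/gradle-cache-action@v1
        with:
          arguments: build
```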

Annotations

2 errors and 4 warnings
build
Failed to run application/job, command=java -jar app/build/libs/data-caterer.jar
build
Error: Command failed: java -jar app/build/libs/data-caterer.jar
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/runner/work/data-caterer/data-caterer/app/build/libs/data-caterer.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.lang.RuntimeException: org.apache.spark.sql.AnalysisException: Column account_status not found in schema Some(StructType(StructField(account_number,StringType,false),StructField(create_time,TimestampType,false),StructField(balance,DoubleType,true))).
    at io.github.datacatering.datacaterer.core.sink.SinkFactory.saveData(SinkFactory.scala:52)
    at io.github.datacatering.datacaterer.core.sink.SinkFactory.pushToSink(SinkFactory.scala:36)
    at io.github.datacatering.datacaterer.core.generator.BatchDataProcessor.$anonfun$pushDataToSinks$4(BatchDataProcessor.scala:115)
    at scala.collection.immutable.List.map(List.scala:293)
    at io.github.datacatering.datacaterer.core.generator.BatchDataProcessor.pushDataToSinks(BatchDataProcessor.scala:109)
    at io.github.datacatering.datacaterer.core.generator.BatchDataProcessor.$anonfun$splitAndProcess$2(BatchDataProcessor.scala:93)
    at io.github.datacatering.datacaterer.core.generator.BatchDataProcessor.$anonfun$splitAndProcess$2$adapted(BatchDataProcessor.scala:35)
    at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
    at scala.collection.immutable.Range.foreach(Range.scala:158)
    at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
    at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
    at io.github.datacatering.datacaterer.core.generator.BatchDataProcessor.splitAndProcess(BatchDataProcessor.scala:35)
    at io.github.datacatering.datacaterer.core.generator.DataGeneratorProcessor.generateDataWithResult(DataGeneratorProcessor.scala:57)
    at io.github.datacatering.datacaterer.core.generator.DataGeneratorProcessor.generateData(DataGeneratorProcessor.scala:42)
    at io.github.datacatering.datacaterer.core.generator.DataGeneratorProcessor.generateData(DataGeneratorProcessor.scala:36)
    at io.github.datacatering.datacaterer.core.plan.PlanProcessor$.executePlanWithConfig(PlanProcessor.scala:56)
    at io.github.datacatering.datacaterer.core.plan.PlanProcessor$.executePlan(PlanProcessor.scala:47)
    at io.github.datacatering.datacaterer.core.plan.PlanProcessor$.$anonfun$determineAndExecutePlan$8(PlanProcessor.scala:33)
    at scala.Option.getOrElse(Option.scala:189)
    at io.github.datacatering.datacaterer.core.plan.PlanProcessor$.$anonfun$determineAndExecutePlan$6(PlanProcessor.scala:33)
    at scala.Option.getOrElse(Option.scala:189)
    at io.github.datacatering.datacaterer.core.plan.PlanProcessor$.determineAndExecutePlan(PlanProcessor.scala:33)
    at io.github.datacatering.datacaterer.App$.main(App.scala:18)
    at io.github.datacatering.datacaterer.App.main(App.scala)
Caused by: org.apache.spark.sql.AnalysisException: Column account_status not found in schema Some(StructType(StructField(account_number,StringType,false),StructField(create_time,TimestampType,false),StructField(balance,DoubleType,true))).
    at org.apache.spark.sql.errors.QueryCompilationErrors$.columnNotFoundInSchemaError(QueryCompilationErrors.scala:1693)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getInsertStatement$4(JdbcUtils.scala:126)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getInsertStatement$2(JdbcUtils.scala:126)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
    at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
    at scala.collection.Index
build
Error: Command failed: docker rm data-caterer
Error response from daemon: No such container: data-caterer
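This cleanup step fails because the container never started (the earlier java step crashed first). A common way to make such a cleanup step tolerant of a missing container, sketched here as an assumption about a possible fix rather than this PR's actual change:

```shell
# Remove the container if it exists; suppress the daemon's "No such
# container" message and ignore the non-zero exit status so a missing
# container does not fail the CI job.
docker rm data-caterer 2>/dev/null || true
echo "cleanup exit code: $?"
```

The `|| true` guard yields exit code 0 whether the container existed or not, so the job proceeds either way.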
build
The following actions use a Node.js version which is deprecated and will be forced to run on node20: burrunan/gradle-cache-action@v1, actions/github-script@v6. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
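Addressing this warning means bumping the flagged actions to releases that run on Node 20. A hedged sketch (the github-script v7 bump is the documented fix; swapping the cache action for gradle/actions/setup-gradle is one common replacement, and its exact inputs may need adjusting for this repository):

```yaml
steps:
  - uses: actions/checkout@v4
  # v7 of github-script runs on Node 20
  - uses: actions/github-script@v7
    with:
      script: console.log('running on node20')
  # gradle/actions/setup-gradle also handles Gradle dependency caching
  # and is maintained against current Node runtimes
  - uses: gradle/actions/setup-gradle@v3
  - run: ./gradlew build
```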
build
distributionSha256Sum is not set in gradle/wrapper/gradle-wrapper.properties. Please consider adding the checksum, see https://docs.gradle.org/current/userguide/gradle_wrapper.html#configuring_checksum_verification
build
distributionSha256Sum is not set in gradle/wrapper/gradle-wrapper.properties. Please consider adding the checksum, see https://docs.gradle.org/current/userguide/gradle_wrapper.html#configuring_checksum_verification
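Silencing this warning means pinning the wrapper distribution's SHA-256 checksum. A sketch of gradle/wrapper/gradle-wrapper.properties with the property added (the Gradle version shown and the checksum value are placeholders, not this repository's actual values; real checksums are published at https://gradle.org/release-checksums/):

```properties
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.5-bin.zip
# Must match the distribution in distributionUrl; placeholder value below
distributionSha256Sum=<sha256-of-gradle-8.5-bin.zip>
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
```

Alternatively, `./gradlew wrapper --gradle-version 8.5 --gradle-distribution-sha256-sum <checksum>` writes the property for you.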