Polish support for checkpointing #31

Closed

juanrh opened this issue Sep 23, 2015 · 2 comments
juanrh (Owner) commented Sep 23, 2015

That should include, at least, getting rid of errors like the following when the test Spark Streaming context is stopped:

15/09/23 16:59:21 WARN SharedStreamingContext: stopping test Spark Streaming context
15/09/23 16:59:21 ERROR CheckpointWriter: Could not submit checkpoint task to the thread pool executor
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler@56e81962 rejected from java.util.concurrent.ThreadPoolExecutor@1fda9d81[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 61]
    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
    at org.apache.spark.streaming.CheckpointWriter.write(Checkpoint.scala:244)
    at org.apache.spark.streaming.scheduler.JobGenerator.doCheckpoint(JobGenerator.scala:291)
    at org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:180)
    at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:83)
    at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:82)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
15/09/23 16:59:21 WARN SharedSparkContext: stopping test Spark context

This doesn't break the tests, but it is not nice. We could try stopping gracefully first and only then stop directly, and also try our luck on the Spark mailing list.

juanrh (Owner, Author) commented Sep 24, 2015

Consider using the following code to stop the streaming context in DStreamProp.forall when a counterexample is found:

    Try {
      ssc.stop(stopSparkContext = false, stopGracefully = true)
    } recover {
      case _ => ssc.stop(stopSparkContext = false, stopGracefully = false)
    }
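
As a self-contained sketch, that fallback could be packaged as a small helper; the helper name is hypothetical and not part of this issue, and it assumes `ssc` is the test StreamingContext:

    import scala.util.Try
    import org.apache.spark.streaming.StreamingContext

    // Hypothetical helper: prefer a graceful stop so pending checkpoint writes
    // can complete, and fall back to an immediate stop if the graceful one fails.
    def stopStreamingContextSafely(ssc: StreamingContext): Unit = {
      Try {
        ssc.stop(stopSparkContext = false, stopGracefully = true)
      } recover {
        case _ => ssc.stop(stopSparkContext = false, stopGracefully = false)
      }
      () // discard the Try result
    }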

juanrh (Owner, Author) commented Jan 11, 2016

Fixed in 0.2.1-SNAPSHOT when using the trait DStreamTLProperty and calling `forAllDStream`.

juanrh closed this as completed Jan 11, 2016