This repository has been archived by the owner on Nov 16, 2019. It is now read-only.

How can I use more cpus in cpu mode? #258

Open
guyang88 opened this issue May 19, 2017 · 1 comment

Comments


guyang88 commented May 19, 2017

@junshi15 @anfeng I run CaffeOnSpark on a CPU grid (32 cores × 10 nodes). In spark-on-YARN mode, what should I set "executor-cores" to? When I set it to a value greater than 1, I get an error. And in Spark standalone mode, what should "core_per_worker" be set to? With the same number of spark_work_instances, do more executor-cores make the task run more efficiently?

@junshi15 (Collaborator)

In YARN mode, set spark.executor.cores = 1.
https://spark.apache.org/docs/latest/configuration.html
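A minimal sketch of what this looks like on the submit command line, assuming the stock CaffeOnSpark YARN launch (the `com.yahoo.ml.caffe.CaffeOnSpark` driver class comes from the project's docs; the jar path, prototxt paths, and executor count below are placeholders you must adapt). The idea is to keep one core per executor as junshi15 says, and scale out by raising the number of executors instead:

```shell
# Hypothetical CaffeOnSpark submit on YARN: one core per executor,
# parallelism comes from --num-executors, not --executor-cores.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 10 \
  --executor-cores 1 \
  --class com.yahoo.ml.caffe.CaffeOnSpark \
  caffe-grid.jar \
  -train \
  -conf lenet_solver.prototxt \
  -model hdfs:///caffe/lenet.model
```

Within each single-core executor, Caffe itself can still use multiple CPU threads if it was built with a multi-threaded BLAS (e.g. OpenBLAS with `OPENBLAS_NUM_THREADS` set), so per-node CPU utilization is not limited to one thread per executor.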
