How do I increase the memory of the Spark driver while downloading Spark NLP for Healthcare models? #321
Answered
JustHeroo asked this question in sparknlp-healthcare
Hi, the Spark driver seems to be running out of memory while I'm downloading the healthcare models. How do I increase the memory of the driver specifically? I'm using the chunkresolve_icd10cm_clinical model with sparknlp_jsl version 3.0.3.
Answered by JustHeroo on Aug 25, 2021
Replies: 1 comment
Can you upgrade to 3.1, then use sbiobertresolve_icd10cm_slim_billable_hcc (https://nlp.johnsnowlabs.com/2021/05/21/sbiobertresolve_icd10cm_slim_billable_hcc_en.html), allocate at least 8G of driver memory, and try jsl_ner_wip_clinical as well?
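As a rough illustration (not part of the original reply): driver memory only takes effect when the driver JVM starts, so it has to be set while the Spark session is being built. A minimal PySpark sketch is below, assuming you add the Spark NLP for Healthcare jars and license secret in whatever way your setup requires (omitted here):

```python
from pyspark.sql import SparkSession

# spark.driver.memory is read when the driver JVM launches,
# so set it while building the session, not on a running one.
spark = (
    SparkSession.builder
    .appName("sparknlp-healthcare")
    .master("local[*]")
    .config("spark.driver.memory", "8g")  # at least 8G, as suggested above
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.kryoserializer.buffer.max", "2000M")
    .getOrCreate()
)
```

If you normally launch the session through sparknlp_jsl.start(), check whether your version lets you pass the same memory setting; otherwise build the session manually as above and register the Healthcare jars the way your license instructions describe.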
Answer selected by JustHeroo