Mac OS crash: "[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead." #374

Open · jwmatthews opened this issue Sep 18, 2024 · 1 comment · May be fixed by #367

@jwmatthews (Member)

We've recently introduced a regression in main that impacts macOS:

objc[79384]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.

Thank you @jmontleon for identifying the workaround:
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES
make run-server

With the above set, we can run on Mac.
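As an alternative to exporting the variable in the shell each time, it could be set in the server's Python entrypoint before gunicorn fork()s any workers (a sketch, not something this repo does; the key is that it must run in the parent process before the first fork, since children inherit the environment and the Objective-C runtime checks it at fork time):

```python
import os

# Same effect as `export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES`:
# tells the macOS Objective-C runtime to skip its post-fork safety
# check instead of crashing the forked worker.
os.environ.setdefault("OBJC_DISABLE_INITIALIZE_FORK_SAFETY", "YES")
```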

More on the rationale for the workaround is in the Ansible issue linked at the bottom of this comment.

Below is what you will see on Mac without the above workaround set:

File logging for 'kai' is set to level 'DEBUG' writing to file: '/Users/jmatthews/git/jwmatthews/kai/logs/kai_server.log'
INFO - 2024-09-18 09:52:42,606 - kai.service.kai_application.kai_application - [  kai_application.py:54   -             __init__()] - Tracing enabled.
INFO - 2024-09-18 09:52:42,609 - kai.service.kai_application.kai_application - [  kai_application.py:63   -             __init__()] - Selected provider: ChatIBMGenAI
INFO - 2024-09-18 09:52:42,609 - kai.service.kai_application.kai_application - [  kai_application.py:64   -             __init__()] - Selected model: meta-llama/llama-3-70b-instruct
objc[79384]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.
objc[79384]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.
[2024-09-18 09:52:42 -0400] [79192] [ERROR] Worker (pid:79384) was sent SIGKILL! Perhaps out of memory?
objc[79385]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.
objc[79385]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.
[2024-09-18 09:52:42 -0400] [79192] [ERROR] Worker (pid:79385) was sent SIGKILL! Perhaps out of memory?
[2024-09-18 09:52:42 -0400] [79387] [INFO] Booting worker with pid: 79387
Config loaded: KaiConfig(log_level='info', file_log_level='debug', log_dir='$pwd/logs', demo_mode=False, trace_enabled=True, gunicorn_workers=8, gunicorn_timeout=3600, gunicorn_bind='0.0.0.0:8080', incident_store=KaiConfigIncidentStore(solution_detectors=<SolutionDetectorKind.NAIVE: 'naive'>, solution_producers=<SolutionProducerKind.TEXT_ONLY: 'text_only'>, args=KaiConfigIncidentStorePostgreSQLArgs(provider=<KaiConfigIncidentStoreProvider.POSTGRESQL: 'postgresql'>, host='127.0.0.1', database='kai', user='kai', password='dog8code', connection_string=None, solution_detection=<SolutionDetectorKind.NAIVE: 'naive'>)), models=KaiConfigModels(provider='ChatIBMGenAI', args={'model_id': 'meta-llama/llama-3-70b-instruct', 'parameters': {'max_new_tokens': 2048}}, template=None, llama_header=None, llm_retries=5, llm_retry_delay=10.0), solution_consumers=[<SolutionConsumerKind.DIFF_ONLY: 'diff_only'>, <SolutionConsumerKind.LLM_SUMMARY: 'llm_summary'>])
Console logging for 'kai' is set to level 'INFO'
File logging for 'kai' is set to level 'DEBUG' writing to file: '/Users/jmatthews/git/jwmatthews/kai/logs/kai_server.log'
INFO - 2024-09-18 09:52:42,686 - kai.service.kai_application.kai_application - [  kai_application.py:54   -             __init__()] - Tracing enabled.
INFO - 2024-09-18 09:52:42,689 - kai.service.kai_application.kai_application - [  kai_application.py:63   -             __init__()] - Selected provider: ChatIBMGenAI
INFO - 2024-09-18 09:52:42,690 - kai.service.kai_application.kai_application - [  kai_application.py:64   -             __init__()] - Selected model: meta-llama/llama-3-70b-instruct
objc[79386]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.
objc[79386]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.
[2024-09-18 09:52:42 -0400] [79192] [ERROR] Worker (pid:79386) was sent SIGKILL! Perhaps out of memory?
[2024-09-18 09:52:42 -0400] [79388] [INFO] Booting worker with pid: 79388
[2024-09-18 09:52:42 -0400] [79389] [INFO] Booting worker with pid: 79389

For those curious, there is an Ansible issue that helped us learn of the workaround: ansible/ansible#76322

@jmontleon (Member)

jmontleon commented Sep 18, 2024

This seems to have been introduced with:
fa08eb4

urllib3 and requests are implicated in the Ansible PR, and this commit updated urllib3, so my guess is that is what brought this on.
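For context, my reading of the linked Ansible issue (not verified against this codebase): on macOS, the standard-library proxy lookup that requests/urllib3 perform goes through the `_scproxy` C extension into the Objective-C SystemConfiguration framework. If that framework gets initialized in the gunicorn master and workers are then fork()ed, the children trip the Objective-C fork-safety check added in macOS High Sierra. A minimal sketch of the call in question:

```python
import platform
import urllib.request

# On macOS this consults the SystemConfiguration framework via the
# _scproxy extension (the Objective-C code implicated in the crash);
# on other platforms it only reads the *_proxy environment variables.
proxies = urllib.request.getproxies()
print(platform.system(), type(proxies).__name__)
```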

@jmontleon jmontleon linked a pull request Sep 18, 2024 that will close this issue