
fix: deadlock between background job and requests #720

Merged 2 commits into main on Nov 7, 2023
Conversation

wsxiaoys (Member) commented Nov 7, 2023

In the background job thread, we first lock requests, then lock engine; in the generate_stream thread, we first lock engine, then lock requests.

This inconsistent lock ordering creates a classic deadlock: each thread can end up holding the lock the other one is waiting for.

Fix #718

@wsxiaoys wsxiaoys merged commit 1ad0d39 into main Nov 7, 2023
5 checks passed
@wsxiaoys wsxiaoys deleted the fix-tokio-deadlock branch November 7, 2023 21:11
wsxiaoys added a commit that referenced this pull request Nov 7, 2023
* fix: deadlock between background job and requests

* refactor: extract LlamaService
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Successfully merging this pull request may close these issues:

Tabby server stops responding to requests after extended use