
switched multiprocessing to torch.multiprocessing #22

Merged — 1 commit merged into KoljaB:master on Dec 20, 2023

Conversation

lendot (Contributor) commented Dec 17, 2023

The Coqui engine's stock Python multiprocessing can cause problems when used alongside software that makes CUDA calls elsewhere (e.g. xtts_api_server). This fix switches from the multiprocessing module over to torch.multiprocessing.
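Since torch.multiprocessing is designed as a drop-in replacement for the stdlib module, the change amounts to swapping the import. The sketch below is illustrative only (the worker function and names are hypothetical, not the actual PR diff), and it falls back to stdlib multiprocessing so it runs even without torch installed:

```python
# Hypothetical sketch of the import swap; torch.multiprocessing re-exports
# the stdlib multiprocessing API, so Process/Queue usage is unchanged.
try:
    import torch.multiprocessing as mp
except ImportError:
    # Fallback so this sketch runs without torch; the real fix uses torch's module.
    import multiprocessing as mp


def synthesize(text, queue):
    # Stand-in for the Coqui worker process that would hold the TTS model.
    queue.put(f"audio for: {text}")


def run_demo():
    # In a CUDA setting you would typically also select the "spawn" start
    # method (e.g. mp.set_start_method("spawn")), since a CUDA context
    # initialized in the parent does not survive a fork.
    queue = mp.Queue()
    worker = mp.Process(target=synthesize, args=("hello", queue))
    worker.start()
    result = queue.get()
    worker.join()
    return result


if __name__ == "__main__":
    print(run_demo())
```

The benefit of torch.multiprocessing over the stdlib module is that it also registers custom reducers so tensors passed between processes use shared memory, which matters when the parent process has already initialized CUDA.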

KoljaB (Owner) commented Dec 17, 2023

Thanks a lot for pointing this out; I'll run some tests with this approach soon. This looks very promising.

@KoljaB KoljaB merged commit 8d8ad51 into KoljaB:master Dec 20, 2023
KoljaB added a commit that referenced this pull request Dec 20, 2023