Async server to handle multiple requests #109
Conversation
This server handles concurrency and is capable of serving multiple clients at once.
Thanks a lot for putting in the work. Will test it soon; there are some issues that I see already:
def play_text_to_speech(self, text):
    self.speaking = True
    try:
        self.stream.feed(text)
        logging.debug(f"Playing audio for text: {text}")
        print(f"Synthesizing: \"{text}\"")
        self.stream.play_async(on_audio_chunk=self.on_audio_chunk, muted=True)
    finally:
        self.speaking = False

self.speaking will instantly be False again, because feed and play_async are both nonblocking.
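One way to keep the flag accurate is to clear it only when playback has actually finished, e.g. from the stream's stop callback. Rough sketch (it assumes the on_audio_stream_stop callback is wired in the TextToAudioStream constructor, and the class is trimmed down to the relevant parts):

from RealtimeTTS import TextToAudioStream

class TTSRequestHandler:
    def __init__(self, engine):
        self.speaking = False
        # on_audio_stream_stop fires once playback has really finished
        self.stream = TextToAudioStream(
            engine, on_audio_stream_stop=self.on_audio_stream_stop)

    def on_audio_stream_stop(self):
        self.speaking = False

    def on_audio_chunk(self, chunk):
        ...  # hand the chunk to the HTTP streaming response, as in the PR

    def play_text_to_speech(self, text):
        self.speaking = True
        self.stream.feed(text)
        # play_async returns immediately; speaking stays True until the stop callback fires
        self.stream.play_async(on_audio_chunk=self.on_audio_chunk, muted=True)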
@app.get("/tts")
def tts(request: Request, text: str = Query(...)):
with tts_lock:
request_handler = TTSRequestHandler(current_engine)
browser_request = is_browser_request(request)
if play_text_to_speech_semaphore.acquire(blocking=False):
try:
if not request_handler.speaking:
threading.Thread(
target=request_handler.play_text_to_speech,
args=(text,),
daemon=True).start()
finally:
play_text_to_speech_semaphore.release()
return StreamingResponse(
request_handler.audio_chunk_generator(browser_request),
media_type="audio/wav"
) Even if request_handler.speaking could be True - so in the case we are already speaking, we don't start a play thread and just do nothing here?
Thx again, will test as soon as I find time...
1. Removed tts-text (the duplicate endpoint).
2. The handler now sets request_handler.speaking = False in on_audio_stream_stop.
3. Responding with a 500 status code and an error message when the request_handler is already busy speaking/playing (see the sketch below).
Here is what I have updated - Pt1, Pt2, Pt3.
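Roughly, the busy path in /tts now looks like this (sketch only; it assumes FastAPI's HTTPException is used to produce the 500 response, and names follow the snippet above rather than the exact committed code):

from fastapi import HTTPException  # assumed way of returning the 500

@app.get("/tts")
def tts(request: Request, text: str = Query(...)):
    with tts_lock:
        request_handler = TTSRequestHandler(current_engine)
        browser_request = is_browser_request(request)

        if not play_text_to_speech_semaphore.acquire(blocking=False):
            # another request is already being synthesized
            raise HTTPException(status_code=500,
                                detail="Engine is busy with another request.")
        try:
            if request_handler.speaking:
                raise HTTPException(status_code=500,
                                    detail="Engine is busy with another request.")
            threading.Thread(
                target=request_handler.play_text_to_speech,
                args=(text,),
                daemon=True).start()
        finally:
            play_text_to_speech_semaphore.release()

        return StreamingResponse(
            request_handler.audio_chunk_generator(browser_request),
            media_type="audio/wav")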
Thanks a lot, now it looks good!
Moved the async_server.py file to the example_fast_api folder.
Thanks. I was about to 😅
This server handles concurrency and is capable of serving multiple clients in parallel.
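For example, two clients can request audio at the same time (sketch; the host, port, and output filenames are placeholders, and the requests library is just one possible HTTP client):

import threading
import requests

BASE_URL = "http://localhost:8000"  # placeholder: wherever async_server.py is running

def fetch(text, out_path):
    # each call hits the /tts endpoint and streams the WAV response to disk
    with requests.get(f"{BASE_URL}/tts", params={"text": text}, stream=True) as resp:
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=4096):
                f.write(chunk)

threads = [
    threading.Thread(target=fetch, args=("Hello from client one", "one.wav")),
    threading.Thread(target=fetch, args=("Hello from client two", "two.wav")),
]
for t in threads:
    t.start()
for t in threads:
    t.join()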