
Response options in gr.Chatbot cause frozen / unexpected behavior in streaming scenarios. #10102

Closed
1 task done
zefu-lu opened this issue Dec 3, 2024 · 5 comments · Fixed by #10123
Labels
bug Something isn't working

Comments


zefu-lu commented Dec 3, 2024

Describe the bug

Related to the new features merged in #9989. @abidlabs

While the demo from that pull request works, the new feature actually doesn't work well for streaming scenarios.

One would expect to see the option buttons after the message is fully rendered. Instead, the options show up from the beginning.

Moreover, if one of the yielded messages doesn't provide options, the ChatInterface freezes. See Reproduction.

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

import gradio as gr

def example(message, history):
    for i in range(len(message)):
        yield {"role": "assistant", "content": message[:i+1]}
    yield {
        "role": "assistant",
        "content": message,
        "options": [
            {"value": "Yes, that's correct.", "label": "Yes"},
            {"value": "No"},
        ],
    }

demo = gr.ChatInterface(
    example,
    type="messages",
)

if __name__ == "__main__":
    demo.launch()
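For reference, the generator in the reproduction can be inspected without launching the UI. Stripped of Gradio, the same streaming pattern shows that every intermediate yield lacks the `options` key and only the final yield carries it (a plain-Python sketch):

```python
def example(message, history):
    # Stream the reply one character at a time; these partial
    # messages carry no "options" key.
    for i in range(len(message)):
        yield {"role": "assistant", "content": message[: i + 1]}
    # Only the final, complete message offers response options.
    yield {
        "role": "assistant",
        "content": message,
        "options": [
            {"value": "Yes, that's correct.", "label": "Yes"},
            {"value": "No"},
        ],
    }

msgs = list(example("Hi", []))
print([("options" in m) for m in msgs])  # → [False, False, True]
```

It is the transition from options-less partial messages to a final message with options that triggers the reported freeze.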

Screenshot

No response

Logs

No response

System Info

Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 5.7.1
gradio_client version: 1.5.0

------------------------------------------------
gradio dependencies in your environment:

aiofiles: 23.2.1
anyio: 4.6.2.post1
audioop-lts is not installed.
fastapi: 0.115.5
ffmpy: 0.4.0
gradio-client==1.5.0 is not installed.
httpx: 0.27.2
huggingface-hub: 0.26.2
jinja2: 3.1.4
markupsafe: 2.1.5
numpy: 2.1.3
orjson: 3.10.11
packaging: 24.2
pandas: 2.2.3
pillow: 11.0.0
pydantic: 2.9.2
pydub: 0.25.1
python-multipart==0.0.12 is not installed.
pyyaml: 6.0.2
ruff: 0.7.4
safehttpx: 0.1.1
semantic-version: 2.10.0
starlette: 0.41.3
tomlkit==0.12.0 is not installed.
typer: 0.13.1
typing-extensions: 4.12.2
urllib3: 2.2.3
uvicorn: 0.32.0
authlib; extra == 'oauth' is not installed.
itsdangerous; extra == 'oauth' is not installed.


gradio_client dependencies in your environment:

fsspec: 2024.10.0
httpx: 0.27.2
huggingface-hub: 0.26.2
packaging: 24.2
typing-extensions: 4.12.2
websockets: 12.0

Severity

I can work around it

@zefu-lu zefu-lu added the bug Something isn't working label Dec 3, 2024
@abidlabs abidlabs self-assigned this Dec 3, 2024
abidlabs (Member) commented Dec 3, 2024

Hmm, I'm not able to reproduce exactly what you're describing. Is it just that the text is being returned extremely quickly from the backend? What happens if you add a time.sleep() in each iteration of the for loop?

If you can share a screen recording of the behavior you are seeing, that would be helpful. Thanks!

zefu-lu (Author) commented Dec 4, 2024

Hmm, I'm not able to reproduce exactly what you're describing. Is it just that the text is being returned extremely quickly from the backend? What happens if you add a time.sleep() in each iteration of the for loop?

If you can share a screen recording of the behavior you are seeing, that would be helpful. Thanks!

2024-12-04.14-44-00.mp4

Sure, the video is above. The code I am running is the same as in production (with time.sleep(0.01) in the for loop).

As you can see, the whole interface froze at the very end: the Retry/Undo buttons are not active, and the textbox itself is no longer interactive.

I just noticed that if I change the yield statement in the for loop to include the same options as the final yield outside the loop, the whole thing works smoothly. However, that is not useful in real-world scenarios, since we usually only know which options to offer the user after we get the entire response from the LLM.

Hope that clarifies things. Thanks!
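The workaround mentioned in the comment above (attaching options to every yielded message so no yield omits the key) can be sketched as follows; the placeholder options are illustrative, since in practice the options depend on the full LLM reply:

```python
import time

# Illustrative placeholder; real options would be derived from the reply.
PLACEHOLDER_OPTIONS = [
    {"value": "Yes, that's correct.", "label": "Yes"},
    {"value": "No"},
]

def example(message, history):
    for i in range(len(message)):
        time.sleep(0.01)
        # Workaround: every partial message carries the same "options"
        # key, so no yield in the stream omits it.
        yield {
            "role": "assistant",
            "content": message[: i + 1],
            "options": PLACEHOLDER_OPTIONS,
        }
    yield {"role": "assistant", "content": message, "options": PLACEHOLDER_OPTIONS}

msgs = list(example("Hi", []))
print(all("options" in m for m in msgs))  # → True
```

As noted in the comment, this avoids the freeze but shows the option buttons while the message is still streaming, which is usually not the desired behavior.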

abidlabs (Member) commented Dec 4, 2024

Ok yep, I can repro this, thanks so much! Looking into it.

abidlabs (Member) commented Dec 4, 2024

Should be fixed via #10123, if you'd like to test it out (see instructions in PR body)

zefu-lu (Author) commented Dec 5, 2024

Works now! Thanks!
