vllm_worker returning LIST instead of STR causes error in openai_api_server.py #2399

Closed
dhgarcia opened this issue Sep 11, 2023 · 6 comments
Labels: bug (Something isn't working)

Comments

@dhgarcia

dhgarcia commented Sep 11, 2023

This is the error message:

2023-09-11 18:09:50 | INFO | stdout | INFO: 137.195.243.7:45492 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
2023-09-11 18:09:50 | ERROR | stderr | ERROR: Exception in ASGI application
2023-09-11 18:09:50 | ERROR | stderr | Traceback (most recent call last):
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
2023-09-11 18:09:50 | ERROR | stderr | result = await app( # type: ignore[func-returns-value]
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
2023-09-11 18:09:50 | ERROR | stderr | return await self.app(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastapi/applications.py", line 292, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await super().__call__(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await self.middleware_stack(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
2023-09-11 18:09:50 | ERROR | stderr | raise exc
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await self.app(scope, receive, _send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await self.app(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
2023-09-11 18:09:50 | ERROR | stderr | raise exc
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await self.app(scope, receive, sender)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
2023-09-11 18:09:50 | ERROR | stderr | raise e
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await self.app(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
2023-09-11 18:09:50 | ERROR | stderr | await route.handle(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
2023-09-11 18:09:50 | ERROR | stderr | await self.app(scope, receive, send)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
2023-09-11 18:09:50 | ERROR | stderr | response = await func(request)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastapi/routing.py", line 273, in app
2023-09-11 18:09:50 | ERROR | stderr | raw_response = await run_endpoint_function(
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastapi/routing.py", line 190, in run_endpoint_function
2023-09-11 18:09:50 | ERROR | stderr | return await dependant.call(**values)
2023-09-11 18:09:50 | ERROR | stderr | File "/home/remote/dh143/Environments/anaconda3/envs/llm-spring/lib/python3.10/site-packages/fastchat/serve/openai_api_server.py", line 408, in create_chat_completion
2023-09-11 18:09:50 | ERROR | stderr | message=ChatMessage(role="assistant", content=content["text"]),
2023-09-11 18:09:50 | ERROR | stderr | File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
2023-09-11 18:09:50 | ERROR | stderr | pydantic.error_wrappers.ValidationError: 1 validation error for ChatMessage
2023-09-11 18:09:50 | ERROR | stderr | content
2023-09-11 18:09:50 | ERROR | stderr | str type expected (type=type_error.str)

Commit a5e6abf introduced the changes in vllm_worker.py that cause this behaviour.
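
The traceback comes down to a type mismatch: the worker hands back content["text"] as a list of strings, while the ChatMessage pydantic model in openai_api_server.py declares content as a str. Below is a minimal sketch of the mismatch and one possible server-side guard, using a stand-in pydantic model rather than FastChat's actual ChatMessage class:

```python
# Stand-in for FastChat's ChatMessage (illustrative only, not the real import path).
from pydantic import BaseModel


class ChatMessage(BaseModel):
    role: str
    content: str  # passing a list here raises "str type expected (type=type_error.str)"


# Shape of the response the patched vllm_worker sends back: a list of strings.
worker_output = {"text": ["Hello", ", world!"]}

# This mirrors openai_api_server.py line 408 and raises the ValidationError above:
# ChatMessage(role="assistant", content=worker_output["text"])

# One possible guard: coerce a list to a single string before building the message.
# (Joining assumes the list holds chunks of one completion; if it instead holds one
# entry per sampled completion, taking text[0] would be the analogous guard.)
text = worker_output["text"]
if isinstance(text, list):
    text = "".join(text)

message = ChatMessage(role="assistant", content=text)
print(message)
```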

@merrymercy
Member

It seems to be fixed by #2398
@leiwen83 Could you help to verify?

merrymercy added the bug label on Sep 11, 2023
@dhgarcia
Author

It seems to be fixed by #2398 @leiwen83 Could you help to verify?

Okay, but that only patches the openai_api_server.
Is that fixing the issue in the right place?
It makes the output format of models running with vllm_worker.py different from the one produced with model_worker.py.

What is the reason for this?

Users with APIs other than the openai_api_server will now need to be aware of this difference in behaviour and handle it accordingly.
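
For clients that call the worker API directly instead of going through the openai_api_server, a small defensive helper like the sketch below would keep them working with either worker. The helper name and payload shapes are illustrative, not part of FastChat's API:

```python
from typing import List, Union


def normalize_worker_text(text: Union[str, List[str]]) -> str:
    """Return a single string whether the worker sent a str (model_worker)
    or a list of strings (vllm_worker after commit a5e6abf)."""
    return "".join(text) if isinstance(text, list) else text


# Both behaviours collapse to the same value for the consumer.
assert normalize_worker_text("Hello, world!") == "Hello, world!"
assert normalize_worker_text(["Hello", ", world!"]) == "Hello, world!"
```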

@merrymercy
Member

merrymercy commented Sep 11, 2023

I see. Your arguments are correct. We should make their output formats the same.

Could you contribute a PR to unify their outputs?

@merrymercy
Member

merrymercy commented Sep 11, 2023

@leiwen83 Sorry but I have to revert #2372 again to address these issues. Please submit a new PR with all bugs fixed.

  1. model_worker and vllm_worker should return the same format (this issue).
  2. Avoid the bug in "bugfix of openai_api_server for fastchat.serve.vllm_worker" (#2398).
  3. Please pass all tests under https://github.com/lm-sys/FastChat/blob/main/docs/commands/test_process.md#test-openai-api-server. I added some new ones.
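
For item 1 above, here is a rough sketch of what unifying the formats could look like on the worker side: whatever the backend produces internally, the streamed payload exposes "text" as a single cumulative string, as model_worker does. The function name and payload fields are illustrative, not the actual vllm_worker code:

```python
import json
from typing import Iterator, List


def stream_unified_payloads(chunks: List[str]) -> Iterator[bytes]:
    """Yield null-delimited JSON payloads whose "text" field is always a
    cumulative str, so consumers see the same shape from both workers."""
    cumulative = ""
    for chunk in chunks:
        cumulative += chunk
        yield (json.dumps({"text": cumulative, "error_code": 0}) + "\0").encode()


for raw in stream_unified_payloads(["Hel", "lo", " world"]):
    print(raw)
```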

@leiwen83
Contributor

@leiwen83 Sorry but I have to revert #2372 again to address these issues. Please submit a new PR with all bugs fixed.

  1. model_worker and vllm_worker should return the same format (this issue).
  2. Avoid the bug in "bugfix of openai_api_server for fastchat.serve.vllm_worker" (#2398).
  3. Please pass all tests under https://github.com/lm-sys/FastChat/blob/main/docs/commands/test_process.md#test-openai-api-server. I added some new ones.

Got it, I will resend the PR.

@leiwen83
Contributor

@merrymercy I have resent the latest PR fixing these issues: #2442. All the bugs mentioned are fixed and the tests pass.

Please help review~

dhgarcia closed this as completed on Oct 3, 2023