Not able to get a streaming response because 'LegacyAPIResponse' object is not iterable.
The following code:
```python
import os

import openai
from openai import AzureOpenAI

os.environ['AZURE_OPENAI_API_KEY'] = "xxx"

client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://xxx.openai.azure.com/",
)

stream = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)

print(stream)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```
Results in:
```
<APIResponse [200 OK] type=<class 'openai.types.chat.chat_completion.ChatCompletion'>>
Traceback (most recent call last):
  File "test.py", line 27, in <module>
    for chunk in stream:
TypeError: 'LegacyAPIResponse' object is not iterable
```
OS: WSL2 Ubuntu
Python version: Python v3.9.7
Library version: openai-python v1.10.0
This is because you need to call `.parse()` on the raw response to get the stream object, e.g.
```python
response = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)
stream = response.parse()
for chunk in stream:
    ...
```
Working code:
```python
import os

import openai
from openai import AzureOpenAI

assert openai.__version__ == "1.10.0"

os.environ['AZURE_OPENAI_API_KEY'] = "xxx"

client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://xxx.openai.azure.com/",
)

response = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)

stream = response.parse()
for chunk in stream:
    if len(chunk.choices) > 0:
        print(chunk.choices[0].delta.content or "", end="")
```
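As a side note, the `len(chunk.choices) > 0` guard matters on Azure because the service can emit chunks with an empty `choices` list (e.g. content-filter annotations). A minimal sketch of accumulating the streamed deltas into the full completion text, using simple stand-in objects in place of real `ChatCompletionChunk` instances (the stand-in `Delta`/`Choice`/`Chunk` classes here are hypothetical stubs for illustration, since no live endpoint is assumed):

```python
from dataclasses import dataclass
from typing import List, Optional


# Stand-in types mirroring the shape of the streamed chunks;
# the real classes live in openai.types.chat.
@dataclass
class Delta:
    content: Optional[str]


@dataclass
class Choice:
    delta: Delta


@dataclass
class Chunk:
    choices: List[Choice]


def collect_stream(stream) -> str:
    """Accumulate delta contents, skipping chunks with an empty choices list."""
    parts = []
    for chunk in stream:
        if chunk.choices:  # Azure can emit chunks with no choices
            parts.append(chunk.choices[0].delta.content or "")
    return "".join(parts)


# Fake stream illustrating the chunk shapes seen in practice:
fake_stream = [
    Chunk(choices=[]),                     # content-filter / keep-alive chunk
    Chunk(choices=[Choice(Delta("Hel"))]),
    Chunk(choices=[Choice(Delta("lo"))]),
    Chunk(choices=[Choice(Delta(None))]),  # final chunk often has content=None
]
print(collect_stream(fake_stream))  # -> Hello
```

Indexing `chunk.choices[0]` without the guard is exactly what raises `IndexError` on those empty chunks, which is why the working code above checks the length first.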