
TypeError: 'LegacyAPIResponse' object is not iterable #1115

Closed
1 task done
psymbio opened this issue Feb 1, 2024 · 2 comments
Labels
question Further information is requested

Comments

psymbio commented Feb 1, 2024

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Unable to consume a streaming response: iterating the return value raises TypeError: 'LegacyAPIResponse' object is not iterable.

To Reproduce

The following code:

import os
import openai
from openai import AzureOpenAI

os.environ['AZURE_OPENAI_API_KEY'] = "xxx"
client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://xxx.openai.azure.com/",
)

stream = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)
print(stream)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")

Results in:

<APIResponse [200 OK] type=<class 'openai.types.chat.chat_completion.ChatCompletion'>>
Traceback (most recent call last):
  File "test.py", line 27, in <module>
    for chunk in stream:
TypeError: 'LegacyAPIResponse' object is not iterable


OS

WSL2 Ubuntu

Python version

Python v3.9.7

Library version

openai-python v1.10.0

@psymbio psymbio added the bug Something isn't working label Feb 1, 2024
@RobertCraigie RobertCraigie added question Further information is requested and removed bug Something isn't working labels Feb 1, 2024
RobertCraigie (Collaborator) commented:

This is because you need to call .parse() to get the stream class, e.g.

response = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)
stream = response.parse()
for chunk in stream:
  ...
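For anyone wondering why the raw response itself cannot be iterated: `with_raw_response` returns a wrapper that carries response metadata (status, headers) and only hands back the iterable `Stream` once you call `.parse()`. The following is a minimal, standalone sketch of that wrap/unwrap pattern; `FakeStream` and `FakeRawResponse` are stand-ins for illustration, not the library's actual classes.

```python
class FakeStream:
    """Stand-in for the SDK's Stream: simply iterable over chunks."""
    def __init__(self, chunks):
        self._chunks = chunks

    def __iter__(self):
        return iter(self._chunks)


class FakeRawResponse:
    """Stand-in for LegacyAPIResponse: wraps the parsed value and exposes
    raw-response metadata. It deliberately defines no __iter__."""
    def __init__(self, parsed, headers):
        self._parsed = parsed
        self.headers = headers

    def parse(self):
        return self._parsed


raw = FakeRawResponse(FakeStream(["Hello", ", ", "world"]), {"x-request-id": "abc"})

# Mirrors the bug report: iterating the wrapper fails,
# because only the object returned by .parse() is iterable.
try:
    for _ in raw:
        pass
except TypeError as e:
    print(e)

stream = raw.parse()      # unwrap first...
print("".join(stream))    # ...then iterate the chunks
```

The design keeps headers accessible (e.g. `raw.headers`) without forcing every caller through the raw layer: code that does not need metadata calls `create()` directly and gets an iterable stream, while raw-response callers explicitly unwrap with `.parse()`.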

@RobertCraigie RobertCraigie closed this as not planned Won't fix, can't repro, duplicate, stale Feb 1, 2024
psymbio (Author) commented Feb 1, 2024

Working code:

import os
import openai
from openai import AzureOpenAI
assert openai.__version__ == "1.10.0"

os.environ['AZURE_OPENAI_API_KEY'] = "xxx"

client = AzureOpenAI(
    api_version="2023-07-01-preview",
    azure_endpoint="https://xxx.openai.azure.com/",
)

response = client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-35-turbo",
    max_tokens=30,
    temperature=0.7,
    stream=True
)
stream = response.parse()
for chunk in stream:
    if len(chunk.choices) > 0:  # Azure may send chunks with an empty choices list (e.g. content-filter results)
        print(chunk.choices[0].delta.content or "", end="")
