feat: Add function to Service to allow stream the results back from a completion request #73
Conversation
OpenAI.SDK/ObjectModels/RequestModels/CompletionCreateRequest.cs
@qbm5 when I test the method, it fetches the data as a stream, but it waits until the stream completes and then processes all of it at once, so I can't get the advantage of streaming data. Does it behave the same way for you?
Yes, I had the same experience. The issue is that the HTTP client waits until the stream completes. I will search a bit for samples; without yield return, the method will be useless.
I just had success doing it this way. It is sloppy, but it is showing promise. I can work on getting it into the PR a bit later.

`public async IAsyncEnumerable<CompletionCreateResponse> CreateCompletionAsStream(CompletionCreateRequest createCompletionRequest, string? modelId = null)`
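For context, the approach discussed above hinges on two things: passing `HttpCompletionOption.ResponseHeadersRead` so `HttpClient` returns as soon as the response headers arrive instead of buffering the whole body, and reading the server-sent-events body line by line with `yield return`. A minimal sketch of the idea (the endpoint path, the `_httpClient` field, and the exact response type are assumptions for illustration, not the final PR code):

```csharp
public async IAsyncEnumerable<CompletionCreateResponse> CreateCompletionAsStream(
    CompletionCreateRequest createCompletionRequest, string? modelId = null)
{
    // The API only emits incremental chunks when Stream is requested.
    createCompletionRequest.Stream = true;

    using var request = new HttpRequestMessage(HttpMethod.Post, "completions")
    {
        Content = JsonContent.Create(createCompletionRequest)
    };

    // ResponseHeadersRead is the key: without it, HttpClient buffers the
    // entire response before SendAsync returns, which defeats streaming.
    using var response = await _httpClient.SendAsync(
        request, HttpCompletionOption.ResponseHeadersRead);

    await using var stream = await response.Content.ReadAsStreamAsync();
    using var reader = new StreamReader(stream);

    while (!reader.EndOfStream)
    {
        var line = await reader.ReadLineAsync();

        // The completions endpoint streams server-sent events:
        // each payload line is prefixed with "data: ".
        if (string.IsNullOrEmpty(line) || !line.StartsWith("data: "))
        {
            continue;
        }

        var payload = line.Substring("data: ".Length);
        if (payload == "[DONE]")
        {
            break; // sentinel marking the end of the stream
        }

        var block = JsonSerializer.Deserialize<CompletionCreateResponse>(payload);
        if (block != null)
        {
            // yield return surfaces each chunk to the caller immediately,
            // rather than after the whole response has been read.
            yield return block;
        }
    }
}
```

The combination of `ResponseHeadersRead` and per-line `yield return` is what lets the caller observe partial results while the server is still writing.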
…t to set headers for the completion
I have pushed the update
Awesome! I pushed some changes too. If you approve, I am going to merge it.
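Once merged, a caller should be able to consume the stream incrementally with `await foreach`. A hypothetical usage sketch (service construction is elided, and the model identifier and property names are assumptions):

```csharp
// Each chunk is yielded as soon as the server sends it, so partial
// text can be rendered before the full completion finishes.
await foreach (var completion in openAiService.Completions.CreateCompletionAsStream(
    new CompletionCreateRequest
    {
        Prompt = "Once upon a time",
        MaxTokens = 50
    }))
{
    Console.Write(completion.Choices.FirstOrDefault()?.Text);
}
```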