Ensure we cancel `consumeUint8ArrayReadableStream` if iteration breaks #428
Conversation
🦋 Changeset detected. Latest commit: 6ffc30e. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Force-pushed from 5e94b45 to 1ef675e, then to ec5b63a, then to ea0e511.
@jridgewell: the amount of real-world dollars that this change will save (not transfer to OpenAI but keep in the dev's pockets) cannot be overstated :)
### What?

This updates `edge-runtime` to the latest version.

### Why?

vercel/edge-runtime#428 fixes `consumeUint8ArrayReadableStream` so that when we break iteration early (due to a client disconnect), we clean up the inner stream. That fires the stream's `cancel` handler and allows devs to disconnect from an AI service fetch.

### How?

`edge-runtime` maintains a `try {} finally {}` over the inner stream's iteration. When we break early, JS calls `it.return()`, which resumes `consumeUint8ArrayReadableStream` with an abrupt completion (essentially, the `yield` turns into a `return`). That triggers the `finally {}` block, where we call `inner.cancel()` to clean up.

Fixes vercel/ai#90
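The `try {} finally {}` mechanics described above can be sketched as follows. This is a simplified illustration of the pattern, not the actual `edge-runtime` source: an early `break` in the consumer invokes the generator's `return()`, so the `finally` block still runs and cancels the inner stream.

```javascript
// Sketch: wrap the inner stream's iteration in try/finally so that an
// abrupt completion (consumer `break` -> generator `return()`) still
// cancels the inner stream and releases its resources.
async function* consumeUint8ArrayReadableStream(body) {
  const reader = body.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value; // a `break` in the consumer resumes here as a `return`
    }
  } finally {
    // Runs on both normal and abrupt completion; cancelling the reader
    // fires the underlying source's cancel() handler.
    await reader.cancel();
  }
}
```

With this in place, a `for await (…) { break; }` over the generator propagates cancellation to the wrapped stream instead of leaking it until GC.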
Thanks to the Vercel team (@jridgewell), an interruption of the stream on the client side now leads to cancellation of the TransformStream on the server side, which in turn cancels the open fetch() to the upstream. This was a long-needed change and we are happy to report it works well. Related: #114

- vercel/ai#90
- vercel/edge-runtime#428
- trpc/trpc#4586 (enormous thanks to the tRPC team for issuing a quick release as well)
Right now, `consumeUint8ArrayReadableStream` never explicitly closes the readable stream it's consuming from. Unfortunately, that means devs aren't able to release their resources, leaking them until the GC eventually runs. This is important for the AI streaming use case, so that devs can detect that a client has disconnected the stream and in turn disconnect the fetch they're maintaining to the AI service.
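The AI streaming use case above can be sketched as follows. This is a hypothetical illustration (`proxyUpstream` and the injectable `fetchImpl` parameter are made up for this example, not part of any library): the server exposes a `ReadableStream` over an upstream fetch and uses the stream's `cancel` handler, which fires on client disconnect once cancellation propagates, to abort that fetch.

```javascript
// Hypothetical server-side helper: proxy an upstream (e.g. AI service)
// response as a ReadableStream, aborting the upstream fetch when the
// stream is cancelled (i.e. when the client disconnects).
function proxyUpstream(url, fetchImpl = fetch) {
  const abort = new AbortController();
  let reader;
  return new ReadableStream({
    async start() {
      const res = await fetchImpl(url, { signal: abort.signal });
      reader = res.body.getReader();
    },
    async pull(controller) {
      // Pull-based relay; respects the consumer's backpressure.
      const { done, value } = await reader.read();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    cancel() {
      // Without the fix, this never fired on client disconnect,
      // leaking the upstream connection until GC.
      abort.abort();
    },
  });
}
```

Once the consumer cancels the returned stream, the `AbortController` tears down the upstream fetch, so the connection to the AI service is released immediately rather than at GC time.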