Web Streams cleanup #56819
Conversation
Tests Passed

Stats from current PR

Default Build (Increase detected)
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| buildDuration | 10.3s | 10.4s | N/A |
| buildDurationCached | 6.2s | 6.3s |  |
| nodeModulesSize | 174 MB | 175 MB |  |
| nextStartRea..uration (ms) | 510ms | 515ms | N/A |
Client Bundles (main, webpack)
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| 199-HASH.js gzip | 27.5 kB | 27.5 kB | N/A |
| 3f784ff6-HASH.js gzip | 53.3 kB | 53.3 kB | ✓ |
| 99.HASH.js gzip | 182 B | 182 B | ✓ |
| framework-HASH.js gzip | 45.3 kB | 45.3 kB | ✓ |
| main-app-HASH.js gzip | 254 B | 252 B | N/A |
| main-HASH.js gzip | 32.9 kB | 32.9 kB | N/A |
| webpack-HASH.js gzip | 1.75 kB | 1.75 kB | N/A |
| Overall change | 98.8 kB | 98.8 kB | ✓ |
Legacy Client Bundles (polyfills)
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| polyfills-HASH.js gzip | 31 kB | 31 kB | ✓ |
| Overall change | 31 kB | 31 kB | ✓ |
Client Pages
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| _app-HASH.js gzip | 206 B | 205 B | N/A |
| _error-HASH.js gzip | 182 B | 180 B | N/A |
| amp-HASH.js gzip | 506 B | 505 B | N/A |
| css-HASH.js gzip | 322 B | 323 B | N/A |
| dynamic-HASH.js gzip | 2.57 kB | 2.57 kB | N/A |
| edge-ssr-HASH.js gzip | 260 B | 259 B | N/A |
| head-HASH.js gzip | 350 B | 350 B | ✓ |
| hooks-HASH.js gzip | 369 B | 369 B | ✓ |
| image-HASH.js gzip | 4.35 kB | 4.35 kB | N/A |
| index-HASH.js gzip | 256 B | 256 B | ✓ |
| link-HASH.js gzip | 2.64 kB | 2.63 kB | N/A |
| routerDirect..HASH.js gzip | 312 B | 311 B | N/A |
| script-HASH.js gzip | 385 B | 384 B | N/A |
| withRouter-HASH.js gzip | 307 B | 308 B | N/A |
| 1afbb74e6ecf..834.css gzip | 106 B | 106 B | ✓ |
| Overall change | 1.08 kB | 1.08 kB | ✓ |
Client Build Manifests
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| _buildManifest.js gzip | 485 B | 482 B | N/A |
| Overall change | 0 B | 0 B | ✓ |
Rendered Page Sizes
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| index.html gzip | 527 B | 529 B | N/A |
| link.html gzip | 541 B | 542 B | N/A |
| withRouter.html gzip | 523 B | 524 B | N/A |
| Overall change | 0 B | 0 B | ✓ |
Edge SSR bundle Size (Overall increase ⚠️)

|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| edge-ssr.js gzip | 93.6 kB | 93.7 kB | N/A |
| page.js gzip | 154 kB | 155 kB |  |
| Overall change | 154 kB | 155 kB |  |
Middleware size
|  | vercel/next.js canary | vercel/next.js refactor/web-streams-cleanup | Change |
| --- | --- | --- | --- |
| middleware-b..fest.js gzip | 624 B | 624 B | ✓ |
| middleware-r..fest.js gzip | 150 B | 151 B | N/A |
| middleware.js gzip | 22.5 kB | 22.5 kB | N/A |
| edge-runtime..pack.js gzip | 1.92 kB | 1.92 kB | ✓ |
| Overall change | 2.55 kB | 2.55 kB | ✓ |
Diff details
Diff for page.js
Diff too large to display
Diff for edge-ssr.js
Diff too large to display
Force-pushed from 29a5605 to 6b1b79a
```ts
if (process.env.NEXT_RUNTIME === 'edge') {
  const { decode } =
    require('../../shared/lib/base64-arraybuffer') as typeof import('../../shared/lib/base64-arraybuffer')
  decodedBody = decode(resData.body)
} else {
  decodedBody = Buffer.from(resData.body, 'base64').subarray()
}
```
This was added before the edge runtime supported `Buffer`; it does now, so we can remove it!
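For illustration (not part of the diff), the branch can then collapse to the Node path alone:

```ts
// With Buffer available in the edge runtime as well, no base64 helper is
// needed; both branches reduce to:
decodedBody = Buffer.from(resData.body, 'base64').subarray()
```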
```diff
+export function createDecodeTransformStream(decoder = new TextDecoder()) {
+  return new TransformStream<Uint8Array, string>({
+    transform(chunk, controller) {
+      return controller.enqueue(decoder.decode(chunk, { stream: true }))
+    },
+  })
+}
-export function decodeText(
-  input: Uint8Array | undefined,
-  textDecoder: TextDecoder
-) {
-  return textDecoder.decode(input, { stream: true })
+export function createEncodeTransformStream(encoder = new TextEncoder()) {
+  return new TransformStream<string, Uint8Array>({
+    transform(chunk, controller) {
+      return controller.enqueue(encoder.encode(chunk))
+    },
+  })
```
Ideally we could swap this out for `TextEncoderStream` and `TextDecoderStream`, but those haven't been polyfilled in the edge runtime yet 👀
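A hypothetical sketch of what that swap could look like once the built-ins land (names are illustrative, not the PR's code):

```ts
// Once TextDecoderStream/TextEncoderStream exist in the edge runtime, the
// custom factories above become thin wrappers over the standard classes.
function decodeStream(bytes: ReadableStream<Uint8Array>): ReadableStream<string> {
  return bytes.pipeThrough(new TextDecoderStream())
}

function encodeStream(text: ReadableStream<string>): ReadableStream<Uint8Array> {
  return text.pipeThrough(new TextEncoderStream())
}
```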
```ts
// No, this cannot be replaced with `finally`, because early cancelling
// the stream will create a rejected promise, and finally will create an
// unhandled rejection.
writer.closed.then(onClose, onClose)
```
We're leveraging the native `reader.pipeTo(writer)` now, which correctly handles cancellation.
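A sketch of that shape (assumed, simplified usage, not the PR's exact code):

```ts
declare const readable: ReadableStream<Uint8Array>
declare const writable: WritableStream<Uint8Array>

// pipeTo settles its returned promise on completion, error, or abort, so
// nothing is left behind as an unhandled rejection.
const controller = new AbortController()

readable.pipeTo(writable, { signal: controller.signal }).catch(() => {
  // both cancellation and stream errors surface here
})

// later, e.g. when the client disconnects:
controller.abort()
```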
```diff
+      return controller.enqueue(chunk)
+    }
-      const content = decodeText(chunk, textDecoder)
```
The single layer of abstraction here made it difficult to understand. These are web primitives; we should use them as-is wherever possible.
```ts
// NOTE: any changes to this file should be mirrored in:
// test/__mocks__/node-polyfill-web-streams.js

if (process.env.NEXT_RUNTIME !== 'edge') {
  global.ReadableStream = ReadableStream
```
This extra wrap ensures that this polyfill doesn't run in the edge runtime.
```diff
@@ -457,7 +460,7 @@ export function patchFetch({
       cacheKey &&
       isCacheableRevalidate
     ) {
-      const bodyBuffer = Buffer.from(await res.arrayBuffer())
+      const bodyBuffer = Buffer.from(await res.clone().arrayBuffer())
```
Reusing a consumed response is prohibited, so we have to clone it first.
That's unneeded in this case, because we return a new `Response` from the body buffer we create here.
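For context, a minimal illustration of the general rule being discussed (hypothetical usage, not the PR's code):

```ts
const res = await fetch('https://example.com')
await res.arrayBuffer() // consumes the body; the stream is now used
// await res.text() // would throw: body already consumed

// Cloning first keeps the original response readable:
const res2 = await fetch('https://example.com')
const cached = Buffer.from(await res2.clone().arrayBuffer())
// res2 can still be streamed to the caller afterwards
```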
```ts
async function getFontForUA(url: string, UA: string): Promise<string> {
  const res = await fetch(url, { headers: { 'user-agent': UA } })
  return await res.text()
}
```
This was too simple to pass up updating 😅
Force-pushed from cb33108 to beb5264
```diff
+  const encoder = new TextEncoder()
   return new TransformStream({
-    async transform(chunk, controller) {
-      const insertedHTMLChunk = encodeText(await getServerInsertedHTML())
-      controller.enqueue(insertedHTMLChunk)
-      controller.enqueue(chunk)
+    start: async (controller) => {
+      const html = await getServerInsertedHTML()
+      controller.enqueue(encoder.encode(html))
```
This looks like a behavior change. The old code ensures that if there is a newly inserted chunk during the stream (after the `start` phase), it will still be enqueued. The new code changes that.
The default behavior for a `TransformStream` without a `transform` option provided for the sink is to pass chunks through without transforming, thereby ensuring that every future chunk is just passed through.
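A quick sketch of that default, for illustration (hypothetical names):

```ts
// With no `transform` callback, TransformStream acts as an identity
// transform: `start` can inject a prefix chunk, and every chunk written
// afterwards flows through unchanged.
const insertPrefix = new TransformStream<Uint8Array, Uint8Array>({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('<!-- inserted -->'))
  },
  // no `transform`: later chunks are passed through as-is
})
```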
Never mind! I see, updating :)
Force-pushed from beb5264 to 9dba803
```diff
-    start(controller) {
-      stream.on('data', (chunk) =>
+    start: async (controller) => {
+      for await (const chunk of stream) {
         controller.enqueue(new KUint8Array([...new Uint8Array(chunk)]))
```
Do we need to construct a `Uint8Array` just to clone it into a `KUint8Array`?
This was matching the existing behaviour seen in:
```ts
public async toResponse() {
  // If we haven't called `send` yet, wait for it to be called.
  if (!this.sent) await this.sendPromise.promise
```
Nit: Why not just return a promise, and resolve that promise with the response?
`toResponse` is (as expected) called on every request. As soon as you `await`, execution pauses to get the value of the promise (even if it's already resolved). To avoid unnecessary pauses, this check bypasses that.
But this is an async function, so this will always have at least 1 tick delay. If you just return a promise resolved to the Response, we keep the same number of overall ticks and (slightly) simplify this method.
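A sketch of the suggested shape (field names are hypothetical, not the PR's actual code):

```ts
// Returning the promise chain directly avoids the extra microtask that an
// `async` wrapper always introduces.
public toResponse(): Promise<Response> {
  return this.sent
    ? Promise.resolve(this.response)
    : this.sendPromise.promise.then(() => this.response)
}
```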
```diff
   }
-  reader.read().then(processValue)
+  ;(init as any)._ogBody = arrayBuffer
```
Nit: Can this be done lazily? We need `bodyChunks` to generate the cache string, but do we always need this?
The body is used by `patchFetch`, as it's used to provide the original response:

```ts
body: (init as any)._ogBody || init.body,
```
Force-pushed from a55ad50 to 7f41b92
Resolved (outdated) review thread on packages/next/src/server/stream-utils/node-web-streams-helper.ts
```ts
controller.enqueue(encoder.encode(before))

// Flush the remaining part of the buffer.
controller.enqueue(encoder.encode(decoder.decode()))
```
Should this be pushing into `buf`?
`buf` isn't consumed after `foundSuffix` is set to `true`.
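For background on the flush call itself, a small illustration (assumed, standard `TextDecoder` behavior):

```ts
// A streaming TextDecoder buffers incomplete multi-byte sequences;
// calling decode() with no arguments flushes whatever remains.
const decoder = new TextDecoder()
decoder.decode(new Uint8Array([0xe2, 0x82]), { stream: true }) // '' (incomplete '€')
decoder.decode(new Uint8Array([0xac]), { stream: true }) // '€'
decoder.decode() // final flush; emits U+FFFD here if bytes were still pending
```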
Force-pushed from 081c68f to 6928ffb

Force-pushed from 6928ffb to c242a5f
This updates some code related to web streams and encoding.

- Drops the custom base64 decode helper in favor of `Buffer` (now supported in the edge runtime)
- Uses `pull` versus `.on("data", (chunk) => { ... })` event handlers (simplifies execution; see the sketch below)
- Uses `pipeTo` and `pipeThrough` on web streams to remove custom code related to stream pumping
- Uses an `AbortController` when piping so that the response can use it to cancel the stream
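For illustration, a minimal sketch of the pull-based shape described above (names are illustrative, not the PR's actual code), adapting a Node.js Readable to a web ReadableStream:

```ts
import type { Readable } from 'node:stream'

// Pull-based adapter: the consumer's demand drives reads via `pull`,
// instead of `.on('data', ...)` handlers pushing chunks eagerly.
function toWebStream(nodeStream: Readable): ReadableStream<Uint8Array> {
  const iterator = nodeStream[Symbol.asyncIterator]()
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { value, done } = await iterator.next()
      if (done) controller.close()
      else controller.enqueue(new Uint8Array(value))
    },
    cancel() {
      // propagate downstream cancellation to the Node stream
      nodeStream.destroy()
    },
  })
}
```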