Enable CURL in Playground Web #1935
Conversation
Enables the CURL PHP extension on playground.wordpress.net when networking is enabled. The heavy lifting was done in #1926. All this PR does is:

* Enables the curl extension
* Rebuilds PHP.wasm for the web
* Enables the curl_exec and curl_multi_exec functions in web browsers
* Unrelated – adds a JSPI vs Asyncify indication to the SAPI name so that we can easily tell which PHP.wasm build Playground is running

Related to #85
Closes #1008

## Testing instructions

Confirm the new E2E tests are sound and that they work in CI. You could also try installing a CURL-reliant plugin such as Plausible and confirm it installs without the fatal errors reported in #1008
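Since the PR adds a JSPI vs Asyncify marker to the SAPI name, one way to see which build is running is to print the SAPI name from PHP itself. A minimal sketch, assuming the `@php-wasm/universal` `PHP.run()` API (the exact call shape is an assumption, not quoted from this PR):

```ts
import type { PHP } from '@php-wasm/universal';

// Hedged sketch: print the SAPI name, which after this PR should
// include a JSPI vs Asyncify indication.
async function printSapiName(php: PHP) {
	const response = await php.run({
		code: '<?php echo php_sapi_name();',
	});
	console.log(response.text);
}
```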
Great work, Adam! I left some review comments in case they are helpful.
const gzippedChunks: Uint8Array[] = [];
gzip.on('data', (chunk) => {
	gzippedChunks.push(chunk);
});
Is it possible that there is only ever one chunk, and if so, would it matter?
It is possible and it seems fine – it would push only one chunk. Or do you have some concerns about this?
Intuitively, I expected testing to involve a response made of multiple chunks and realized that wasn't guaranteed here. If we don't need to test that here, then it doesn't matter. I still need to finish reading the TLS 1.2 PR and don't have an opinion yet. :)
Ah, I see! In this test, it only matters that we get a Content-Length that represents the compressed data and is inconsistent with the uncompressed data.
Thanks for explaining!
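For anyone who does want multiple chunks in a test like this, one option is to shrink zlib's output chunk size. A minimal sketch, assuming Node's `node:zlib` stream API (the 1 KB value is arbitrary and purely illustrative):

```ts
import { createGzip } from 'node:zlib';

// With a small output chunk size, zlib emits multiple 'data' events
// whenever the compressed output exceeds 1 KB, instead of a single one.
const gzip = createGzip({ chunkSize: 1024 });

const gzippedChunks: Uint8Array[] = [];
gzip.on('data', (chunk: Buffer) => {
	gzippedChunks.push(chunk);
});
```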
const responseBodyPart1 = await reader.read();
await reader.read(); // Skip the chunk delimiter
It might be good to confirm that each of these blind reads actually skips what we think it is skipping.
Good call! I'm not very happy with this test but I also couldn't come up with a better one on the spot.
I think it's fine and good that the test exists. :) We can always improve things along the way.
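One way to make those blind reads self-checking is to assert that the skipped bytes really are the CRLF chunk delimiter. A minimal sketch, assuming `reader` is the same stream reader as in the snippet above and that the test's chunk boundaries line up with individual reads:

```ts
// Instead of a blind read, verify the skipped bytes are the CRLF
// delimiter that chunked transfer encoding places after each chunk.
const delimiterRead = await reader.read();
const delimiterText = new TextDecoder().decode(delimiterRead.value);
if (delimiterText !== '\r\n') {
	throw new Error(
		`Expected a chunk delimiter, got ${JSON.stringify(delimiterText)}`
	);
}
```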
 *
 * Instead of a fixed Content-Length, we'll use Transfer-Encoding: chunked,
 * and then provide a per-chunk Content-Length. See fetchRawResponseBytes()
 * for the details.
Great comment!
Thank you!
has left any, include some satires that may be published without too
destructive results fifty years hence. He was, I believe, not in the
least an ill-natured man: very much the opposite, I should say; but he
would not suffer fools gladly.`;
I confess that I actually read this (and enjoyed it).
Haha, same!
// enter an infinite loop. Let's disable it completely to
// throw a fatal error instead.
phpIniEntries['disable_functions'] =
	'curl_exec,curl_multi_exec';
Good catch.
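For context, a sketch of the conditional shape involved, assuming an illustrative `networkingEnabled` flag (not the PR's exact variable names):

```ts
// Illustrative flag: true when Playground boots with networking support.
const networkingEnabled = false;

const phpIniEntries: Record<string, string> = {};
if (!networkingEnabled) {
	// Without networking, curl_exec() would enter an infinite loop,
	// so disable the functions completely and let PHP throw a fatal
	// error instead.
	phpIniEntries['disable_functions'] = 'curl_exec,curl_multi_exec';
}
```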
They are helpful @brandonpayton, thank you for taking the time to review!
Enables the CURL PHP extension on playground.wordpress.net when networking is enabled. This is made possible by the TLS 1.2 implementation merged in #1926.
This PR:

* Enables the curl extension
* Rebuilds PHP.wasm for the web
* Enables the curl_exec and curl_multi_exec functions in web browsers
* Unrelated – adds a JSPI vs Asyncify indication to the SAPI name so that we can easily tell which PHP.wasm build Playground is running
Related to #85
Closes #1008
## Why use Transfer-Encoding: chunked?
Web servers often respond with a combination of Content-Length
and Content-Encoding. For example, a 16 KB text file may be compressed
to 4 KB with gzip and served with a Content-Encoding of gzip
and a Content-Length of 4 KB.
The web browser, however, exposes neither the Content-Encoding header
nor the gzipped data stream. All we have access to is the original
Content-Length value of the gzipped file and a decompressed data stream.
If we just pass that along to the PHP-side request handler, it would
see a 16 KB body stream with a Content-Length of 4 KB. It would then
truncate the body stream at 4 KB and discard the rest of the data.
This is not what we want.
To correct that behavior, we're stripping the Content-Length entirely.
We do that for every single response because we don't have any way
of knowing whether any Content-Encoding was used. Furthermore, we can't
just calculate the correct Content-Length value without consuming the
entire content stream – and we want to pass each data chunk to PHP
as we receive it.
Instead of a fixed Content-Length, this PR uses Transfer-Encoding: chunked,
and then provides a per-chunk Content-Length.
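For illustration, here is what that per-chunk framing looks like on the wire. This is the standard chunked transfer encoding format; `encodeChunk` is a hypothetical helper, not the PR's actual implementation:

```ts
// Frame one body chunk per the chunked wire format: the chunk's byte
// length in hexadecimal, CRLF, the chunk bytes, then a trailing CRLF.
function encodeChunk(chunk: Uint8Array): Uint8Array {
	const encoder = new TextEncoder();
	const header = encoder.encode(chunk.byteLength.toString(16) + '\r\n');
	const trailer = encoder.encode('\r\n');
	const framed = new Uint8Array(
		header.byteLength + chunk.byteLength + trailer.byteLength
	);
	framed.set(header, 0);
	framed.set(chunk, header.byteLength);
	framed.set(trailer, header.byteLength + chunk.byteLength);
	return framed;
}

// A zero-length chunk ("0\r\n\r\n") marks the end of the body, which is
// how the receiver knows the stream is complete without a Content-Length.
const finalChunk = new TextEncoder().encode('0\r\n\r\n');
```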
## Testing instructions
Confirm the new E2E tests are sound and that they work in CI. You could also try installing a CURL-reliant plugin such as Plausible and confirm it installs without the fatal errors reported in #1008