Vulnerability: Memory Overflow when sending big files #5412
There are several open issues and discussions around this topic. I think the core team is currently working on improving this. Endpoints just recently received the ability to get a `ReadableStream` from a request, which you can use to read uploaded files chunk by chunk without filling your memory. But again, there is not yet a "recommended" (as in: present in the documentation) way to do this.
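For illustration, here is a minimal sketch of what such chunk-by-chunk reading could look like in an endpoint. This is not an official recommendation; the handler name `post` and the availability of `request.body` as a web `ReadableStream` are assumptions based on the comment above.

```js
// Hypothetical endpoint sketch: consume the request body as a stream instead of
// buffering the whole upload in memory (assumes `request.body` is a ReadableStream).
export async function post({ request }) {
	let received = 0;
	const reader = request.body.getReader();

	while (true) {
		const { done, value } = await reader.read();
		if (done) break;
		// Each chunk could be written to disk or piped elsewhere instead of kept in RAM.
		received += value.byteLength;
	}

	return { status: 200, body: { received } };
}
```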
I would guess the issue is here: `kit/packages/kit/src/node/index.js`, lines 21 to 43 in 82c57b4.
We're blindly enqueueing data as soon as it's available; we probably need to use
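The comment above is cut off, so the intended fix isn't spelled out here. One common pattern for avoiding unbounded buffering is to respect the stream controller's backpressure signal, roughly like the following sketch; the wrapper function name is made up and the global `ReadableStream` (Node 18+) is an assumption, so treat this as an illustration rather than the maintainers' actual fix.

```js
// Sketch: wrap a Node IncomingMessage in a web ReadableStream that respects backpressure,
// pausing the socket whenever the consumer falls behind instead of enqueueing blindly.
function node_request_to_stream(req) {
	return new ReadableStream({
		start(controller) {
			req.on('error', (err) => controller.error(err));
			req.on('end', () => controller.close());
			req.on('data', (chunk) => {
				controller.enqueue(new Uint8Array(chunk));
				// Stop reading from the socket once the internal queue is full;
				// pull() resumes it when the consumer catches up.
				if (controller.desiredSize <= 0) req.pause();
			});
		},
		pull() {
			req.resume();
		},
		cancel() {
			req.destroy();
		}
	});
}
```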
(I have to add that it brings me great pleasure that someone going by the handle @UnlimitedBytes is responsible for this particular issue)
Answer to #5412 (comment)
Hey, first thanks for the fast response.
All the above listed issues/pull requests are related to this issue but already closed. Also, they all describe how request body / response body streaming should be implemented, or implement it. All of them are good on their own, but they don't directly address the issue that the body gets parsed before there is any possibility to stream it.
Answer to #5412 (comment)
This looks like the right place, as the linked code will get generated to the following inside

```js
if (length > 0) {
	let offset = 0;

	req.on('data', (chunk) => {
		const new_len = offset + Buffer.byteLength(chunk);

		if (new_len > length) {
			return reject({
				status: 413,
				reason: 'Exceeded "Content-Length" limit'
			});
		}

		data.set(chunk, offset);
		offset = new_len;
	});
} else {
	req.on('data', (chunk) => {
		const new_data = new Uint8Array(data.length + chunk.length);
		new_data.set(data, 0);
		new_data.set(chunk, data.length);
		data = new_data;
	});
}

req.on('end', () => {
	fulfil(data);
});
```

which, when commented out, will not trigger the issue, and the node process will then only run at around

Anyway, the suggested fix using
**AT THIS POINT THIS IS A MAJOR ISSUE WHICH MAKES EVERY SVELTEKIT SITE VULNERABLE!**

After some more research on this issue I updated the title to reflect its importance. Due to the fact that this bug is not related to a specific adapter (only

The bug can be mitigated by services like Vercel, CloudFlare, Netlify, etc., because they spin up multiple server instances for an app. Even then, the server that's handling the specific request will still fail and die, though. For any deployment solution where no serverless platform at planet scale is involved, this crashes the entire underlying server when enough data gets sent!
UPDATE
I updated the title and added some more information on the importance of this issue here: #5412 (comment)
Describe the bug
⚠ Disclaimer
First of all, I'm not a native English speaker, so please forgive my sometimes bad English! ❤
Second of all, I'm not entirely sure if this issue belongs to `sveltejs/kit` or more to `sveltejs/kit/adapter-node`, but as they are both in the same repository I'm posting this issue here anyway.

Third of all, I know there may be people saying this is more of a feature request than a bug. But in my opinion it's a bug, as it's blocking me from using SvelteKit in this project.
🐞 The bug
SvelteKit processes any `multipart/form-data` that hits a POST endpoint completely in RAM before reaching out to the endpoint. This makes me, and anyone who wants to enable file uploading, run into 2 related but different problems.

🌟 A solution
I don't know if this is acceptable or not (maybe not, because it will most likely introduce a breaking change), but a very simple solution for this problem would be to let the user decide if the body should be parsed for them. This, in my opinion, is the "cleanest way" of fixing this, as we already have the `request.formData()`, `.text()`, `.json()`, etc. functions in the Web APIs Standard.

Little side fact: at first I thought my code was causing this behaviour because I used a `.formData()` call, which for me was the place where such a thing would happen, but currently it happens regardless of whether that call exists.

Reproduction
Skeleton Project
A skeleton project which generates a ~1GiB big file inside the browser and sends it to the node server can be found at: https://github.com/UnlimitedBytes/sveltekit-issue-overflow-formdata
Can't recommend
Anyway, I do not recommend using this to test the issue; a much better way (which won't kill your browser's RAM either) would be to generate a random file on your OS and upload it via a POST form!
Better way
To generate a ~1GiB random file on different operating systems, use:
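The original per-OS commands are not reproduced above. As one cross-platform alternative (a sketch, not the original instructions; the file name and chunk sizes are arbitrary), a short Node script can create the file:

```js
// generate-file.mjs — hypothetical helper that writes ~1 GiB of random data to disk.
// Run with: node generate-file.mjs
import { createWriteStream } from 'node:fs';
import { randomBytes } from 'node:crypto';

const out = createWriteStream('random-1gib.bin');
const chunk_size = 16 * 1024 * 1024; // 16 MiB per write
const total_chunks = 64; // 64 * 16 MiB = 1 GiB

for (let i = 0; i < total_chunks; i++) {
	const ok = out.write(randomBytes(chunk_size));
	// Wait for the stream to drain so the script itself stays low on memory.
	if (!ok) await new Promise((resolve) => out.once('drain', resolve));
}

out.end();
```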
A simple form to upload a file should look like this:
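The original markup is not shown above; a minimal form along these lines should work (the field name `file` and the action `/upload` are assumptions matching the `upload.js` endpoint described further below):

```svelte
<!-- src/routes/upload.svelte — hypothetical minimal upload form -->
<form method="POST" action="/upload" enctype="multipart/form-data">
	<input type="file" name="file" />
	<button type="submit">Upload</button>
</form>
```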
EDIT: This was a false assumption; it is not needed to do anything with the data, you can directly dump the data to the garbage collector! Doing something with it only holds the data a bit longer for better visibility, but the RAM usage will quickly increase anyway until the garbage collector kicks in.
In order to trigger the issue, a POST endpoint handler is needed! So create an `upload.js` and/or an `upload.svelte` file with the following `upload.js` content:
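The original `upload.js` content is not reproduced above. A minimal handler in the pre-1.0 endpoint style might look like the following sketch; the return shape and the optional `formData()` call are assumptions based on the discussion, not the reporter's exact code.

```js
// src/routes/upload.js — hypothetical minimal POST endpoint for the reproduction.
// The body is already buffered in RAM before this handler runs, so even doing
// nothing with the request is enough to trigger the memory growth (see the EDIT above).
export async function post({ request }) {
	// Optional: reading the form data only keeps the buffered body alive a bit longer.
	const form = await request.formData();
	const file = form.get('file');

	return {
		status: 200,
		body: { received: file ? file.name : null }
	};
}
```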
Logs
![windows task manager showing node process on around one GiB RAM usage](https://i.imgur.com/Jmts2Wz.png)
System Info
Severity
blocking all usage of SvelteKit (MAJOR - SECURITY ISSUE)
Additional Information
No response