
Fix MultipartReader for big files #4865

Merged 16 commits on Apr 29, 2020
Changes from 3 commits
18 changes: 9 additions & 9 deletions in std/mime/multipart.ts

@@ -281,10 +281,10 @@ export class MultipartReader {
   * null value means parsing or writing to file was failed in some reason.
   * @param maxMemory maximum memory size to store file in memory. bytes. @default 1048576 (1MB)
Contributor Author
@ry Go uses 10MB as the default in readForm, yet the comment in the code says 1MB; just want confirmation. I think 1MB is a good value to keep in Buffer.

Member

We should definitely follow Go's example. 10MB sounds good to me.

Contributor Author

Done. Reverted back to 10MB and updated comment.
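The policy being debated above can be sketched in isolation. This is a hypothetical illustration of the `readForm` memory cap, not Deno's actual implementation: parts up to `maxMemory` bytes stay in the in-memory buffer, larger parts spill to disk, and (mirroring the patch) non-file values get extra headroom on top of `maxMemory`. The helper names are mine.

```typescript
const MB = 1 << 20;

// Mirrors `maxValueBytes = maxMemory + (10 << 20)` from the final code:
// the value buffer gets 10MB of headroom on top of maxMemory.
function maxValueBytes(maxMemory: number): number {
  return maxMemory + 10 * MB;
}

// Decide where a part of `size` bytes goes under the discussed policy,
// with Go's 10MB default as the fallback.
function storagePlan(size: number, maxMemory: number = 10 * MB): "memory" | "disk" {
  return size > maxMemory ? "disk" : "memory";
}

console.log(storagePlan(1 * MB));  // a small field stays in memory
console.log(storagePlan(50 * MB)); // a big upload spills to disk
```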

   * */
-  async readForm(maxMemory = 10 << 20): Promise<MultipartFormData> {
+  async readForm(maxMemory = 1 << 20): Promise<MultipartFormData> {
     const fileMap = new Map<string, FormFile>();
     const valueMap = new Map<string, string>();
-    let maxValueBytes = maxMemory + (10 << 20);
+    let maxValueBytes = maxMemory + (1 << 20);
     const buf = new Buffer(new Uint8Array(maxValueBytes));
     for (;;) {
       const p = await this.nextPart();
@@ -308,22 +308,22 @@ export class MultipartReader {
       }
       // file
       let formFile: FormFile | undefined;
-      const n = await copy(buf, p);
+      const n = await copyN(buf, p, maxValueBytes);
Contributor Author

While testing I noticed that parsing big files crashed with: `error: Uncaught Error: The buffer cannot be grown beyond the maximum size.`

This happens because `copy` was being used instead of `copyN`, so it tried to fill the buffer with the whole file.
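The failure mode described above can be reproduced with a minimal, self-contained sketch. This is not Deno's io module; `FixedBuffer`, `copy`, and `copyN` here are simplified stand-ins that only model the capacity behavior: `copy` writes the entire source and blows past a fixed capacity, while `copyN` caps the bytes written, as the fix does.

```typescript
// Stand-in for a buffer with a hard maximum size (illustrative only).
class FixedBuffer {
  private data: number[] = [];
  constructor(private capacity: number) {}
  write(chunk: number[]): void {
    if (this.data.length + chunk.length > this.capacity) {
      throw new Error("The buffer cannot be grown beyond the maximum size.");
    }
    this.data = this.data.concat(chunk);
  }
  get length(): number {
    return this.data.length;
  }
}

// copy: writes the WHOLE source; throws once the part exceeds capacity.
function copy(dst: FixedBuffer, src: number[]): number {
  dst.write(src);
  return src.length;
}

// copyN: writes at most `max` bytes, so a big part fills the buffer
// exactly instead of overflowing it.
function copyN(dst: FixedBuffer, src: number[], max: number): number {
  const n = Math.min(max, src.length);
  dst.write(src.slice(0, n));
  return n;
}

const bigFile = new Array(2048).fill(0); // a part bigger than the buffer
const buf = new FixedBuffer(1024);       // maxValueBytes-sized buffer
const n = copyN(buf, bigFile, 1024);     // fills the buffer, no crash
// copy(new FixedBuffer(1024), bigFile); // would throw the error above
```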

       const contentType = p.headers.get("content-type");
       assert(contentType != null, "content-type must be set");
-      if (n > maxMemory) {
+      if (n >= maxValueBytes) {
         // too big, write to disk and flush buffer
         const ext = extname(p.fileName);
         const { file, filepath } = await tempFile(".", {
           prefix: "multipart-",
           postfix: ext,
         });
         try {
-          const size = await copyN(
-            file,
-            new MultiReader(buf, p),
-            maxValueBytes
-          );
+          // write buffer to file
+          let size = await copyN(file, buf, n);
+          // Write the rest of the file
+          size += await copy(file, new MultiReader(buf, p));

           file.close();
           formFile = {
             filename: p.fileName,
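The replaced block above spills in two phases: first flush the `n` bytes already captured in the in-memory buffer to the temp file, then stream the remainder of the part directly to disk so the whole file never has to fit in memory. A sketch of that shape, with plain arrays standing in for the buffer, the part reader, and the temp file (all names illustrative, not Deno's API):

```typescript
// Two-phase spill-to-disk, as in the patched code path.
function spillToDisk(
  buffered: number[], // bytes already in the fixed-size buffer (n bytes)
  rest: number[]      // remainder of the part, streamed, unbounded
): { file: number[]; size: number } {
  const file: number[] = [];
  // Phase 1: flush the buffered bytes (`copyN(file, buf, n)` in the patch).
  let size = buffered.length;
  for (const b of buffered) file.push(b);
  // Phase 2: stream the rest straight to the file
  // (`copy(file, new MultiReader(buf, p))` in the patch); only this tail
  // is unbounded, and it never re-enters the fixed-size buffer.
  for (const b of rest) file.push(b);
  size += rest.length;
  return { file, size };
}
```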