
CompressionStream hangs #51728

Open
davewasmer opened this issue Feb 12, 2024 · 2 comments

Comments


davewasmer commented Feb 12, 2024

Version

v21.6.1

Platform

Darwin ****** 23.0.0 Darwin Kernel Version 23.0.0: Fri Sep 15 14:42:42 PDT 2023; root:xnu-10002.1.13~1/RELEASE_X86_64 x86_64

Subsystem

globals, streams/web

What steps will reproduce the bug?

function from(src) {
  return new ReadableStream({
    start(controller) {
      controller.enqueue(src);
      controller.close();
    },
  });
}

async function read(stream) {
  const reader = stream.getReader();
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}

await read(from(new ArrayBuffer(4)).pipeThrough(new CompressionStream('gzip')));

How often does it reproduce? Is there a required condition?

Every time

What is the expected behavior? Why is that the expected behavior?

Should resolve to [Uint8Array(10), Uint8Array(10)].

What do you see instead?

Nothing - reader.read() hangs and never resolves.

Additional information

Running the code snippet in the Chrome DevTools console produces the correct outcome. Using 'deflate' rather than 'gzip' makes no difference; it still hangs.

IlyasShabi (Contributor) commented

I explored the issue, and it appears that Node.js expects the "chunk" argument in the _write function to be a string, Buffer, or Uint8Array, not an ArrayBuffer.

A potential workaround involves converting the ArrayBuffer to a Buffer before processing it:

const buffer = Buffer.from(new ArrayBuffer(4));
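The workaround can be sketched end to end: the same repro shape as in the report, but with the bytes enqueued as a Uint8Array view rather than a bare ArrayBuffer, so the compressor accepts the chunk. This is a minimal sketch, assuming Node.js 18+ (where CompressionStream and the web streams classes are global) and an ESM context for top-level await:

```javascript
// Same shape as the repro above, but the source enqueues a Uint8Array
// view instead of a raw ArrayBuffer, so the stream's _write accepts it.

function from(src) {
  return new ReadableStream({
    start(controller) {
      controller.enqueue(src);
      controller.close();
    },
  });
}

async function read(stream) {
  const reader = stream.getReader();
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}

// Four zero bytes, wrapped in a Uint8Array view.
const input = new Uint8Array(4);
const chunks = await read(
  from(input).pipeThrough(new CompressionStream('gzip')),
);
console.log(chunks.map((c) => c.byteLength)); // chunk sizes of the gzip output
```

With the view in place, the pipeline drains and resolves instead of hanging.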

@nodejs/streams I experimented with converting ArrayBuffer to Buffer directly within the _write function like this:

if (Stream._isAnyArrayBuffer(chunk)) {
  encoding = 'buffer';
  chunk = Buffer.from(chunk);
}

It worked, but I'm not convinced it's the best way to add support for ArrayBuffer.
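Until a fix lands in core, the same normalization can be done in userland. The sketch below is my own (not from the thread) and assumes Node.js 18+ with the global TransformStream and CompressionStream; it converts bare ArrayBuffer chunks into Uint8Array views before they reach the compressor:

```javascript
// A pass-through transform that converts bare ArrayBuffer chunks into
// Uint8Array views, which the compression stream's _write accepts.
function normalizeChunks() {
  return new TransformStream({
    transform(chunk, controller) {
      if (chunk instanceof ArrayBuffer) {
        chunk = new Uint8Array(chunk);
      }
      controller.enqueue(chunk);
    },
  });
}

// An ArrayBuffer source now compresses instead of hanging.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue(new ArrayBuffer(4)); // the chunk type that hangs today
    controller.close();
  },
});

const compressed = [];
const reader = source
  .pipeThrough(normalizeChunks())
  .pipeThrough(new CompressionStream('gzip'))
  .getReader();
for (let r = await reader.read(); !r.done; r = await reader.read()) {
  compressed.push(r.value);
}
```

Note that Stream._isAnyArrayBuffer also matches SharedArrayBuffer; a complete userland check would test for both types, not just ArrayBuffer.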

marco-ippolito (Member) commented

Hi @IlyasShabi, would you like to open a PR with your solution? I think it's worth giving it a try.
