
Different behavior between transform function and async generator using pipelines #32363

Closed
Xstoudi opened this issue Mar 19, 2020 · 24 comments
Labels: help wanted (Issues that need assistance from volunteers or PRs that need help to proceed), stream (Issues and PRs related to the stream subsystem)

Comments

Xstoudi (Contributor) commented Mar 19, 2020

  • Version: v13.11.0
  • Platform: Windows 10 64-bit
  • Subsystem: stream

What steps will reproduce the bug?

index.mjs:

import { pipeline, PassThrough } from 'stream'

async function* asyncGen(source) {
  for await (const chunk of source) {
    yield chunk;
  }
}

pipeline(process.stdin, asyncGen, process.stdout, err => {
  console.error('end')
})

Then running echo "hello" | node index.mjs ends up like this:

"hello"
end

but if you replace asyncGen in the pipeline with new PassThrough(), it ends up like this:

"hello"

As you can see, the callback is never called.

How often does it reproduce? Is there a required condition?

What is the expected behavior?

"hello"
end

What do you see instead?

"hello"

Additional information

To be honest, I don't know whether this is expected behavior or a bug. Since I can't find anything related in the documentation, it feels like a bug.

Xstoudi changed the title from "Different behavior between transform function and async generator in pipelines" to "Different behavior between transform function and async generator using pipelines" Mar 19, 2020
targos added the stream label Mar 19, 2020
targos (Member) commented Mar 19, 2020

@nodejs/streams

ronag (Member) commented Mar 19, 2020

Minimal repro on master.

'use strict';

const common = require('../common');
const assert = require('assert');
const { pipeline, PassThrough } = require('stream');
const cp = require('child_process');

if (process.argv[2] === 'child') {
  pipeline(
    process.stdin,
    new PassThrough(),
    process.stdout,
    common.mustCall((err) => {
      console.error('end');
    })
  );
  return;
}

cp.exec([
  'echo',
  '"hello"',
  '|',
  `"${process.execPath}"`,
  `"${__filename}"`,
  'child'
].join(' '), common.mustCall((err, stdout, stderr) => {
  console.log(err)
  assert.ifError(err);
  console.log(stdout);
}));

ronag (Member) commented Mar 19, 2020

Seems like stdout is never finished, even though the passthrough is writableEnded.

ronag (Member) commented Mar 19, 2020

So this fails:

  pipeline(
    process.stdin,
    new PassThrough(),
    process.stdout,
    common.mustCall((err) => {
      console.error('end');
    })
  );

while this doesn't:

  pipeline(
    process.stdin,
    new PassThrough().on('end', () => {
      process.stdout.end()
    }),
    process.stdout,
    common.mustCall((err) => {
      console.error('end');
    })
  );

ronag (Member) commented Mar 19, 2020

@addaleax This does not seem related to pipeline or PassThrough; it seems to be something with process.stdout. This fails as well:

'use strict';

const common = require('../common');
const assert = require('assert');

if (process.argv[2] === 'child') {
  process.stdin
    .pipe(process.stdout)
    .on('finish', common.mustCall(() => {
      console.log('finish');
    }));
} else {
  const cp = require('child_process');

  cp.exec([
    'echo',
    '"hello"',
    '|',
    `"${process.execPath}"`,
    `"${__filename}"`,
    'child'
  ].join(' '), common.mustCall((err, stdout, stderr) => {
    assert.ifError(err);
  }));
}

ronag (Member) commented Mar 19, 2020

The reason this works with pipeline + generator is probably that that path does not use pipe.

Maybe pipeline should prefer the 'readable' API over pipe when possible; would that provide better compatibility? @mcollina

ronag (Member) commented Mar 19, 2020

I believe this is the culprit: https://github.com/nodejs/node/blob/master/lib/_stream_readable.js#L654. Unfortunately I don't understand this part at the moment.

Basically pipe won't end() the dest if it is process.stdout. Not sure why that is the case. If there is a good reason for that, then maybe pipeline + generators shouldn't do it either?

vweevers (Contributor) commented

If there is a good reason for that

#7606 (comment)

ronag (Member) commented Mar 19, 2020

@vweevers: Does that mean

then maybe pipeline + generators shouldn't do it either?

Do we need a special case here in pipeline to complete once the stream before stdout has emitted 'end'?

vweevers (Contributor) commented

I'm not sure we can do that. Before completing, we'd want to make sure the writes to stdout have been flushed, but stdout can be synchronous or asynchronous depending on platform and destination.

Alternative point of view: stdout is documented as being different from regular streams. The fact that stdout cannot be closed implies that a pipeline to stdout cannot complete. That's unfortunate, but logically solid.

ronag (Member) commented Mar 19, 2020

If there is a good reason for that

#7606 (comment)

It is a bit random though, since this only applies to the pipe protocol. We have three other "read" protocols where this currently does not apply.

The fact that stdout cannot be closed

It can be closed, it's just that pipe doesn't close it.

ronag (Member) commented Mar 19, 2020

Alternative point of view: stdout is documented as being different from regular streams.

I can see that, but I can't see the documentation mentioning this particular thing? Should we add it?

ronag added a commit to nxtedition/node that referenced this issue Mar 19, 2020
stdio (stderr & stdout) should for compatibility
reasons not be closed/end():ed. However, this
causes pipeline with a stdio destination to
never finish. This commit fixes this issue at
a performance cost.

Refs: nodejs#7606

Fixes: nodejs#32363
vweevers (Contributor) commented

Ah, my understanding of it was outdated. Since Node v10.12.0 (see #23053 and ERR_STDOUT_CLOSE) you can indeed close these streams. Before, they used to throw an error; now they allow it but "secretly" don't close the underlying fd.

I would guess that the pipe behavior was needed to prevent that error. If so, we no longer need that workaround today. Perhaps @addaleax or @mcollina can shed some light on this. In the meantime I'll see if I can piece together the history.

vweevers (Contributor) commented

Confirmed, the 9-year-old commit that introduced the now-defunct throwing behavior also introduced the pipe workaround: 13324bf

ronag (Member) commented Mar 20, 2020

@vweevers: So is a possible fix to make pipeline actually call end() on the stdio stream?

src.pipe(dst);
if (dst === process.stdout) {
  src.on('end', () => dst.end());
}

vweevers (Contributor) commented

I think so, but because there's so much history here, let's give the mentioned folks some time to respond.

targos (Member) commented Apr 6, 2020

So what should we do about this?

targos added the help wanted label Apr 6, 2020
vweevers (Contributor) commented Apr 6, 2020

As far as I can tell, because of #23053, we no longer need the 2nd and 3rd lines here:

const doEnd = (!pipeOpts || pipeOpts.end !== false) &&
              dest !== process.stdout &&
              dest !== process.stderr;

That would fix the issue.
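
With those two lines gone, the check would reduce to something like this (a sketch; shouldEnd is a hypothetical name for illustration, not the actual internal variable, which is computed inline):

```javascript
'use strict';

// Sketch of the simplified check once the stdout/stderr special case is
// removed. pipeOpts mirrors the options object that pipe(dest, pipeOpts)
// receives; end defaults to true unless the caller opts out.
function shouldEnd(pipeOpts) {
  return !pipeOpts || pipeOpts.end !== false;
}

console.log(shouldEnd());               // true: end the destination by default
console.log(shouldEnd({ end: false })); // false: caller opted out
```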

Xstoudi (Contributor, Author) commented Apr 6, 2020

I can open a PR deleting the 2nd and 3rd lines if that's all it takes to solve this.

ronag (Member) commented Apr 6, 2020

@Xstoudi seems there is a little more to it unfortunately, #32373 (comment)

ronag closed this as completed in 98b6b2d Apr 6, 2020
vweevers (Contributor) commented Apr 6, 2020

@ronag Do we not want to fix this for pipe()?

ronag (Member) commented Apr 6, 2020

@ronag Do we not want to fix this for pipe()?

Was closed through the commit. I think that's a different issue than this though.

But yes, I think we should fix it in pipe() as well. pipeline should still have the workaround though, to support older versions of streams from e.g. readable-stream.

vweevers (Contributor) commented Apr 6, 2020

pipeline should still have the workaround though, to support older versions of streams from e.g. readable-stream.

Can we safely do that? Because we would be calling .end() twice.

ronag (Member) commented Apr 6, 2020

Can we safely do that? Because we would be calling .end() twice.

I think calling end twice is safe in this case.

BethGriggs pushed a commit that referenced this issue Apr 7, 2020
stdio (stderr & stdout) should for compatibility
reasons not be closed/end():ed. However, this
causes pipeline with a stdio destination to
never finish. This commit fixes this issue at
a performance cost.

Refs: #7606

Fixes: #32363

PR-URL: #32373
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
targos pushed a commit that referenced this issue Apr 12, 2020 (same commit message, refs, and reviewers as the Apr 7 commit)