
Buffer.from(str) memory leak #38300

Closed
Daninet opened this issue Apr 19, 2021 · 32 comments
Labels
buffer Issues and PRs related to the buffer subsystem. memory Issues and PRs related to the memory management or memory footprint.

Comments

@Daninet

Daninet commented Apr 19, 2021

  • Version: v14.16.1, v15.14.0, v16.0.0
  • Platform: Microsoft Windows NT 10.0.19042.0 x64
  • Subsystem: Buffer

What steps will reproduce the bug?

Run the following code with node --expose-gc

const str = 'x'.repeat(10000);

console.log('start', process.memoryUsage());

for(let i = 0; i < 1e6; i++) {
  Buffer.from(str); // memory leak happens here
  if (i % 1e5 === 0) {
    gc();
    console.log('step', i, 'rss', process.memoryUsage().rss);
  }
}

gc();
console.log('end', process.memoryUsage());

How often does it reproduce? Is there a required condition?

It reproduces consistently with v14.16.1 and v15.14.0.
It only reproduces with a string parameter; arrays do not cause the issue.
I also tested some older versions (v10.22.0, v12.8.1) and the leak is not present there.
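To illustrate the distinction (a hedged sketch; the variable names are illustrative), the array-based form of the same call copies from a V8-managed typed array and did not show the growth:

```javascript
// Sketch contrasting the two call forms from the report above.
// Only the string form (byte length >= 4096) triggered the growth.
const str = 'x'.repeat(10000);
const arr = new Uint8Array(10000).fill(0x78); // 0x78 === 'x'

const fromString = Buffer.from(str); // leaking path on affected versions
const fromArray = Buffer.from(arr);  // copies the typed array; no leak observed

console.log(fromString.equals(fromArray)); // true: same 10000 bytes
```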

What is the expected behavior?

Created buffers should be garbage collected, and memory usage shouldn't increase too much during execution.

What do you see instead?

Memory usage increases to over 10 GB with the supplied code snippet. I get OOM errors when I increase the number of iterations.

start {
  rss: 18243584,
  heapTotal: 4055040,
  heapUsed: 3168160,
  external: 260263,
  arrayBuffers: 10778
}
step 0 rss 19763200
step 100000 rss 1078853632
step 200000 rss 2133364736
step 300000 rss 3189673984
step 400000 rss 4241575936
step 500000 rss 5293998080
step 600000 rss 6354276352
step 700000 rss 7406329856
step 800000 rss 8458436608
step 900000 rss 9510563840
end {
  rss: 10562609152,
  heapTotal: 5980160,
  heapUsed: 3170888,
  external: 21400146,
  arrayBuffers: 10054
}
@Ayase-252 Ayase-252 added the memory Issues and PRs related to the memory management or memory footprint. label Apr 19, 2021
@Ayase-252
Member

Ayase-252 commented Apr 19, 2021

rss seems to stabilize around 5 GB on my machine (macOS).

out/Release/node -v                    
v16.0.0-pre


out/Release/node --expose-gc index-2.js
start {
  rss: 21245952,
  heapTotal: 4050944,
  heapUsed: 3169480,
  external: 253370,
  arrayBuffers: 10762
}
step 0 rss 22884352
step 100000 rss 1076359168
step 200000 rss 2127024128
step 300000 rss 3178852352
step 400000 rss 4227698688
step 500000 rss 4917829632
step 600000 rss 4834082816
step 700000 rss 4781207552
step 800000 rss 4714610688
step 900000 rss 4624203776
end {
  rss: 4560502784,
  heapTotal: 5894144,
  heapUsed: 3159872,
  external: 21382689,
  arrayBuffers: 10038
}

@bytemain

macOS 11

Node.js 14:

start {
  rss: 20209664,
  heapTotal: 4206592,
  heapUsed: 2560312,
  external: 828505,
  arrayBuffers: 9898
}
step 0 rss 21798912
step 100000 rss 1076994048
step 200000 rss 2127327232
step 300000 rss 3179413504
step 400000 rss 4227592192
step 500000 rss 4739432448
step 600000 rss 4694577152
step 700000 rss 4600496128
step 800000 rss 4497977344
step 900000 rss 4408086528
end {
  rss: 4287864832,
  heapTotal: 7929856,
  heapUsed: 2226328,
  external: 37181504,
  arrayBuffers: 9898
}

Node.js 12:

start {
  rss: 19390464,
  heapTotal: 4644864,
  heapUsed: 2476832,
  external: 782967,
  arrayBuffers: 9386
}
step 0 rss 21192704
step 100000 rss 128675840
step 200000 rss 141754368
step 300000 rss 138477568
step 400000 rss 141754368
step 500000 rss 129318912
step 600000 rss 133066752
step 700000 rss 144846848
step 800000 rss 141770752
step 900000 rss 153034752
end {
  rss: 147271680,
  heapTotal: 7581696,
  heapUsed: 2168520,
  external: 933915,
  arrayBuffers: 53559386
}

@rensrongen

I experience this issue running Node v15.14.0 on Ubuntu 20.04 (LTS) x64.

Basically, I have a long-running server process that runs out of memory and crashes periodically. It makes frequent Buffer.from() method calls.

The problem started after upgrading to v15.14.0.

@Ayase-252
Member

Ayase-252 commented Apr 20, 2021

It seems the issue did not happen in v14.0.0. I will bisect to find which change introduced it.

v14.0.0

node --expose-gc index.js
start {
  rss: 19959808,
  heapTotal: 4149248,
  heapUsed: 2313384,
  external: 764993,
  arrayBuffers: 9382
}
step 0 rss 21532672
step 100000 rss 127037440
step 200000 rss 129449984
step 300000 rss 141750272
step 400000 rss 133693440
step 500000 rss 136900608
step 600000 rss 135507968
step 700000 rss 128778240
step 800000 rss 133877760
step 900000 rss 130506752
end {
  rss: 132894720,
  heapTotal: 7610368,
  heapUsed: 2092368,
  external: 925930,
  arrayBuffers: 51169382
}

@Ayase-252 Ayase-252 added buffer Issues and PRs related to the buffer subsystem. confirmed-bug Issues with confirmed bugs. labels Apr 20, 2021
@rensrongen

rensrongen commented Apr 20, 2021

FWIW the issue was not present in v15.11.0 from which I upgraded.

@Daninet
Author

Daninet commented Apr 20, 2021

I just tested it, node.js v15.11.0 / Windows 10 / x64 leaks memory on my computer with the supplied code.

@Linkgoron
Member

I'm seeing a relatively large change between v14.4 and 14.5

@Daninet
Author

Daninet commented Apr 20, 2021

I can confirm, it was broken with v14.5.0.
v14.4.0 works fine.

@XadillaX
Contributor

Try this code?

const str = 'x'.repeat(10000);

console.log('start', process.memoryUsage());

function wait(ms) {
  let resolve;
  const promise = new Promise((_resolve, _reject) => { resolve = _resolve; });

  setTimeout(() => {
    resolve();
  }, ms);

  return promise;
}

(async () => {
  for(let i = 0; i < 1e6; i++) {
    Buffer.from(str); // memory leak happens here
    if (i % 1e5 === 0) {
      await wait(100);
      gc();
      console.log('step', i, 'rss', process.memoryUsage().rss);
    }
  }
})().then(() => {
  gc();
  console.log('end', process.memoryUsage());
});

@bytemain

v14.16.0

XadillaX's code:

start {
  rss: 20185088,
  heapTotal: 4206592,
  heapUsed: 2555408,
  external: 828505,
  arrayBuffers: 9898
}
step 0 rss 21876736
step 100000 rss 251977728
step 200000 rss 279126016
step 300000 rss 287125504
step 400000 rss 297897984
step 500000 rss 304611328
step 600000 rss 257728512
step 700000 rss 280563712
step 800000 rss 270651392
step 900000 rss 264871936
end {
  rss: 1112817664,
  heapTotal: 7667712,
  heapUsed: 2237200,
  external: 33190872,
  arrayBuffers: 9898
}

@Daninet
Author

Daninet commented Apr 21, 2021

Yeah, it does not happen if control goes back to the event loop. But still, it's not a solution, given that a lot of libraries and code rely on synchronous processing of buffers. It would be a serious disadvantage of Buffers compared to Uint8Arrays.

@XadillaX
Contributor

I think it's because of ee6ec14. This commit replaced uv_buf_t with ArrayBuffer / BackingStore in Buffer's inner data.

Using ee6ec14:

start {
  rss: 33239040,
  heapTotal: 4153344,
  heapUsed: 2344464,
  external: 779835,
  arrayBuffers: 9382
}
step 0 rss 34242560
step 100000 rss 1061609472
step 200000 rss 2083704832
step 300000 rss 3105091584
step 400000 rss 4129398784
step 500000 rss 5150466048
step 600000 rss 6171836416
step 700000 rss 7192920064
step 800000 rss 8220708864
step 900000 rss 9241792512
end {
  rss: 10263416832,
  heapTotal: 7614464,
  heapUsed: 2105888,
  external: 23539770,
  arrayBuffers: 9382
}

Using its parent 39f42d1:

start {
  rss: 33226752,
  heapTotal: 4153344,
  heapUsed: 2344296,
  external: 779835,
  arrayBuffers: 9382
}
step 0 rss 34349056
step 100000 rss 114434048
step 200000 rss 115458048
step 300000 rss 117305344
step 400000 rss 114778112
step 500000 rss 114855936
step 600000 rss 114880512
step 700000 rss 117583872
step 800000 rss 115372032
step 900000 rss 115372032
end {
  rss: 115531776,
  heapTotal: 7614464,
  heapUsed: 2105888,
  external: 43969770,
  arrayBuffers: 42049382
}

And I found the parent commit is much faster than the refactored one.

Shall we change it back to libuv's Buffer?

@Daninet
Author

Daninet commented Apr 21, 2021

It looks like the length of the string also matters.
When str.length < 4096 there is no leak.
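That matches the default pool behavior. A sketch (assuming the default `Buffer.poolSize` of 8192 and the string fast path in `lib/buffer.js`): strings shorter than `Buffer.poolSize >>> 1` are served from the shared pool, while longer ones get an individually tracked external allocation:

```javascript
// Assumes the default pool size; the 4096 threshold is poolSize >>> 1.
const threshold = Buffer.poolSize >>> 1; // 4096 with the default 8192 pool

const short = Buffer.from('x'.repeat(threshold - 1)); // pooled fast path
const long = Buffer.from('x'.repeat(threshold));      // external allocation

// A pooled Buffer is a view into the larger shared pool ArrayBuffer:
console.log(short.buffer.byteLength >= Buffer.poolSize); // true
// The external allocation is sized exactly to the string's byte length:
console.log(long.buffer.byteLength === threshold);       // true
```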

@XadillaX
Contributor

bind to: #33291

@XadillaX
Contributor

bind to: #33381

@cjk

cjk commented Apr 21, 2021

From my tests and experience, v15.12.x still runs stable (no crashes), but newer releases crash periodically (mostly SIGSEGV), so I'm not sure this is related to this particular issue.

@Ayase-252
Member

Ayase-252 commented Apr 21, 2021

It seems that the memory regression was introduced by #33291 (e2cd715 specifically) on the master branch, as @XadillaX mentioned.

Result with its previous commit
➜  node git:(56ff1ee55a) ✗ out/Release/node --expose-gc index.js 
start {
  rss: 19697664,
  heapTotal: 4153344,
  heapUsed: 2329880,
  external: 780709,
  arrayBuffers: 9382
}
step 0 rss 21311488
step 100000 rss 138866688
step 200000 rss 141926400
step 300000 rss 159461376
step 400000 rss 150708224
step 500000 rss 161411072
step 600000 rss 159981568
step 700000 rss 159526912
step 800000 rss 160628736
step 900000 rss 168017920
end {
  rss: 153833472,
  heapTotal: 7614464,
  heapUsed: 2114208,
  external: 70921776,
  arrayBuffers: 67289382
}

e2cd715

➜  node git:(e2cd715361) ✗ out/Release/node --expose-gc index.js
start {
  rss: 19668992,
  heapTotal: 4153344,
  heapUsed: 2357168,
  external: 780709,
  arrayBuffers: 9382
}
step 0 rss 21192704
step 100000 rss 1075273728
step 200000 rss 2125484032
step 300000 rss 3177619456
step 400000 rss 3333910528
step 500000 rss 3327348736
step 600000 rss 3313135616
step 700000 rss 3241398272
step 800000 rss 3112476672
step 900000 rss 3484286976
end {
  rss: 3680956416,
  heapTotal: 7614464,
  heapUsed: 2111816,
  external: 31201776,
  arrayBuffers: 9382
}

cc @jasnell @addaleax Could you please take a look?

@jasnell
Member

jasnell commented Apr 21, 2021

It looks like a possible combination of that change and c1ee70e. The cleanup callbacks are not being called until the node.js process exits. Investigating. @addaleax ... any thoughts?

@XadillaX
Contributor

It looks like a combination of that change and c1ee70e. The cleanup callbacks are not being called until the node.js process exits. Investigating.

I'll have a look tomorrow daytime.

@jasnell
Member

jasnell commented Apr 21, 2021

I don't believe that #33291 and e2cd715 are the issue. Running the test case, I don't see AllocatedBuffer actually being used here. The underlying data buffer is allocated using UncheckedMalloc() and passed through to create an ArrayBuffer with a deleter. The problem appears to be that, for whatever reason, the cleanup callback is not called until process exit, which causes the memory growth. @addaleax ... you may have a much better idea of what's happening here.

When str.length < 4096 there is no leak.

That's because the code path that introduces the leak is not invoked for sizes < 4096. This further confirms that c1ee70e is likely the culprit here.

Still, it's not entirely clear what is happening; there's definitely something a bit off here.

@jasnell
Member

jasnell commented Apr 21, 2021

Actually, scratch that. The issue is the for loop here...

const str = 'x'.repeat(10000);

console.log('start', process.memoryUsage());

for(let i = 0; i < 1e6; i++) {
  Buffer.from(str); // memory leak happens here
  if (i % 1e5 === 0) {
    gc();
    console.log('step', i, 'rss', process.memoryUsage().rss);
  }
}

gc();
console.log('end', process.memoryUsage());

The Buffers are being allocated in a synchronous for loop. For every Buffer, this allocates a "tracked" v8::BackingStore whose memory lives outside of the V8 heap. When the garbage collector kicks in, an internal callback fires that we use to schedule the cleanup and deallocation logic. We use a native-level SetImmediate() here that defers that cleanup until after the synchronously executing JavaScript is done running. The for loop, however, blocks the event loop from turning, so the cleanup logic keeps being delayed. Once the event loop is allowed to turn, the memory is freed and there is no more leak. That said, the rss remains high because it can take some time for the OS to reclaim the process memory.

Bottom line: there's not actually a problem here from Node.js' point of view. Everything is working as it should. The challenge is that the memory management here is not very intuitive. The gc() cleans up the memory that V8 knows about, but freeing the externally allocated storage behind the v8::BackingStore is deferred, and that deferral is blocked by the synchronous JavaScript.
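A minimal sketch of that explanation (the loop shape and the `run` helper are illustrative): yielding one macrotask tick inside the loop lets the deferred native cleanup run, which is why the async variant earlier in the thread behaves differently:

```javascript
const str = 'x'.repeat(10000);

// Same shape as the repro, but it yields to the event loop periodically so
// the native SetImmediate() cleanup scheduled after GC can actually run.
async function run(iterations, step) {
  let yields = 0;
  for (let i = 0; i < iterations; i++) {
    Buffer.from(str);
    if (i % step === 0) {
      await new Promise((resolve) => setImmediate(resolve)); // turn the loop
      yields++;
      if (global.gc) gc(); // only available with --expose-gc
    }
  }
  return yields;
}

run(1e5, 1e4).then((n) => console.log('yielded', n, 'times'));
```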

@jasnell jasnell removed the confirmed-bug Issues with confirmed bugs. label Apr 21, 2021
@Daninet
Author

Daninet commented Apr 21, 2021

I also dug into it.

node/src/node_buffer.cc

Lines 184 to 199 in 053aa6d

void CallbackInfo::OnBackingStoreFree() {
  // This method should always release the memory for `this`.
  std::unique_ptr<CallbackInfo> self { this };
  Mutex::ScopedLock lock(mutex_);
  // If callback_ == nullptr, that means that the callback has already run from
  // the cleanup hook, and there is nothing left to do here besides to clean
  // up the memory involved. In particular, the underlying `Environment` may
  // be gone at this point, so don’t attempt to call SetImmediateThreadsafe().
  if (callback_ == nullptr) return;
  env_->SetImmediateThreadsafe([self = std::move(self)](Environment* env) {
    CHECK_EQ(self->env_, env);  // Consistency check.
    self->CallAndResetCallback();
  });
}

OnBackingStoreFree() is called synchronously, but the SetImmediateThreadsafe() call postpones the deallocation until after the synchronous for loop has finished. If I remove the SetImmediateThreadsafe() and call the delete callback synchronously, the leak disappears. Isn't there a way to avoid the setImmediate() while keeping the c1ee70e fix in place?

@addaleax
Member

@Daninet Unfortunately, the existing native API contract (i.e. towards addons and embedders) for Buffers is that their callback is a) called on the original thread where the Buffer was created, and b) that it is generally possible to run JS code in that callback.

That requires some kind of asynchronous solution, at least in the general case. We could also change the API contract in a semver-major change, but it’s hard to communicate the changed API to addons (which are often unmaintained, especially those not using N-API, which is what this is about).

I think for this particular issue, we could avoid using UncheckedMalloc() and instead allocate the backing store directly, if that helps.

@jasnell
Member

jasnell commented Apr 21, 2021

I think for this particular issue, we could avoid using UncheckedMalloc() and instead allocate the backing store directly, if that helps.

Yeah I was thinking the same, although I'm not entirely convinced it would make much difference. We'd free the memory up a bit sooner, sure, but the test case is pretty pathological anyway. I'm not convinced there's a lot of justification to optimize for it.

@Daninet
Author

Daninet commented Apr 21, 2021

It's conceivable that some people run into this issue while doing benchmarks or data science in fully synchronous JS. It can also cause memory spikes leading to OOM crashes on Node.js backends running low on RAM. But I see that it's tricky to fix...

Wouldn't it make sense to add some words to the documentation about this async GC behavior, if it's not going to be fixed?

@jasnell
Member

jasnell commented Apr 21, 2021

Documentation would be good, for sure. It's tricky though, because it's a fairly complicated issue that doesn't apply to the majority of situations. I will give some thought to how to document it and am always open to suggestions! :-) ... Whether we change this is still up in the air. @addaleax's suggestion should work and should be fairly simple -- and I certainly would not block it! I'm just thinking that there are better ways of addressing the problem. For instance, it would be conceivable to allocate a large slab of memory at the start of the benchmark or data analysis and simply slice off of it as needed -- effectively doing your own memory management rather than repeatedly allocating and freeing and relying on gc behavior in a tight loop like this. It would be much more efficient at the cost of some increased code complexity.
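A sketch of that slab idea (the `Slab` class is hypothetical, not a Node.js API): allocate one large Buffer up front and hand out subarray views, so the hot loop performs no new external allocations:

```javascript
// Hypothetical slab allocator: one big allocation, many zero-copy views.
class Slab {
  constructor(size) {
    this.buf = Buffer.allocUnsafe(size);
    this.offset = 0;
  }
  alloc(n) {
    // Naive wrap-around reuse; a real allocator would track lifetimes.
    if (this.offset + n > this.buf.length) this.offset = 0;
    const view = this.buf.subarray(this.offset, this.offset + n);
    this.offset += n;
    return view;
  }
}

const slab = new Slab(1024 * 1024); // 1 MiB up front
const str = 'x'.repeat(10000);

// Write the string into a slab slice instead of calling Buffer.from(str):
const chunk = slab.alloc(Buffer.byteLength(str));
chunk.write(str);
console.log(chunk.length); // 10000
```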

@Daninet
Author

Daninet commented Apr 21, 2021

If somebody is aware of this issue, then surely it can be avoided, and there are a lot of ways to do so: more efficient algorithms, TextEncoder, or even async sleeps that let the buffers be GC'd.

The problematic part lies in the unexpected and undocumented behavior, which does not occur with Buffer.from(arr), Buffer.from(shortStr), or Buffer.alloc(num). At least, I could not reproduce it with anything other than strings with length >= 4096.
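For instance, the TextEncoder route mentioned above can be sketched like this (assuming, per the discussion in the thread, that the deferred cleanup is specific to Buffer's external string allocation, while the Uint8Array from TextEncoder is an ordinary V8-managed allocation):

```javascript
const str = 'x'.repeat(10000);

// Uint8Array whose backing store is managed by V8's garbage collector:
const viaTextEncoder = new TextEncoder().encode(str);
// Buffer taking the external-allocation path for strings >= 4096 bytes:
const viaBufferFrom = Buffer.from(str);

// Both hold the same UTF-8 bytes:
console.log(viaTextEncoder.length === viaBufferFrom.length);    // true
console.log(Buffer.from(viaTextEncoder).equals(viaBufferFrom)); // true
```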

jasnell added a commit to jasnell/node that referenced this issue Apr 21, 2021
Signed-off-by: James M Snell <jasnell@gmail.com>
Refs: nodejs#38300
jasnell added a commit to jasnell/node that referenced this issue Apr 21, 2021
Previously, the code path would allocate a tracked ArrayBuffer
that defers cleanup and deallocation of the underlying data with
a SetImmediate. Avoid the unnecessary deferral by just allocating
a `BackingStore` directly and writing into it.

Fixes: nodejs#38300
Refs: nodejs#38336
@jasnell
Member

jasnell commented Apr 21, 2021

Ok, I have two possible approaches and two PRs here.

The first is just a documentation fix in #38336

The second in #38337 follows @addaleax's suggestion to avoid the UncheckedMalloc() entirely... which does work and yields a nice improvement to this particular test:

start {
  rss: 33652736,
  heapTotal: 4050944,
  heapUsed: 3165544,
  external: 253370,
  arrayBuffers: 10762
}
step 0 rss 35561472
step 100000 rss 72196096
step 200000 rss 72196096
step 300000 rss 72196096
step 400000 rss 72196096
step 500000 rss 72196096
step 600000 rss 72196096
step 700000 rss 72593408
step 800000 rss 72593408
step 900000 rss 72593408
end {
  rss: 72593408,
  heapTotal: 5894144,
  heapUsed: 3161104,
  external: 27063474,
  arrayBuffers: 22010038
}

@XadillaX
Contributor

Ok, I have two possible approaches and two PRs here. [...]

After testing:

const str = 'x'.repeat(10000);

console.log('start', process.memoryUsage());

console.time('1');
for (let i = 0; i < 1e6; i++) {
  Buffer.from(str); // memory leak happens here
  if (i % 1e5 === 0) {
    // await wait(100);
    gc();
    console.log('step', i, 'rss', process.memoryUsage().rss);
  }
}
console.timeEnd('1');

gc();
console.log('end', process.memoryUsage());

The new PR #38337 runs in about 3.9s on my computer, and 39f42d1a2d (before using BackingStore) runs in about 3.7s. I think this performance loss is acceptable.

@Daninet
Author

Daninet commented Apr 22, 2021

I like #38337; it fixes the memory leak issue. On my machine it finishes the loop in 7.9s, while master (with the memory leak) does the same in 9.3s.

@addaleax
Member

I think the “performance loss” is really just the deallocations happening earlier.

@jasnell
Member

jasnell commented Apr 22, 2021

I think the “performance loss” is really just the deallocations happening earlier.

Yes, that's precisely what it is. Overall, I still think the code sample as-is is problematic and a different approach should be taken.

@targos targos closed this as completed in 720fdf2 Apr 24, 2021
targos pushed a commit that referenced this issue Apr 29, 2021
Previously, the code path would allocate a tracked ArrayBuffer
that defers cleanup and deallocation of the underlying data with
a SetImmediate. Avoid the unnecessary deferral by just allocating
a `BackingStore` directly and writing into it.

Fixes: #38300
Refs: #38336

PR-URL: #38337
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Khaidi Chu <i@2333.moe>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
Reviewed-By: Joyee Cheung <joyeec9h3@gmail.com>
targos pushed a commit that referenced this issue May 30, 2021
targos pushed a commit that referenced this issue Jun 5, 2021
targos pushed a commit that referenced this issue Jun 11, 2021
liuxingbaoyu added a commit to liuxingbaoyu/node that referenced this issue Apr 11, 2022
When Buffer::New passes in existing data,
it cannot be garbage collected during synchronous JavaScript execution.

Fixes: nodejs#40828
Refs: nodejs#38300
nodejs-github-bot pushed a commit that referenced this issue May 2, 2022
When Buffer::New passes in existing data,
it cannot be garbage collected during synchronous JavaScript execution.

Fixes: #40828
Refs: #38300

PR-URL: #42695
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Minwoo Jung <nodecorelab@gmail.com>
RafaelGSS pushed a commit that referenced this issue May 10, 2022
juanarbol pushed a commit that referenced this issue May 31, 2022
danielleadams pushed a commit that referenced this issue Jun 27, 2022
targos pushed a commit that referenced this issue Jul 12, 2022
targos pushed a commit that referenced this issue Jul 31, 2022
guangwong pushed a commit to noslate-project/node that referenced this issue Oct 10, 2022
9 participants