Possible memory leak in TLS(Wrap?) #1075
Comments
I wonder if there is a handle leak. I noticed that TLSWrap::OnReadSelf() creates a JS buffer object without a HandleScope. Does this patch make a difference?

```diff
diff --git a/src/tls_wrap.cc b/src/tls_wrap.cc
index 1647ab6..2b00d9e 100644
--- a/src/tls_wrap.cc
+++ b/src/tls_wrap.cc
@@ -641,6 +641,7 @@ void TLSWrap::OnReadSelf(ssize_t nread,
                          uv_handle_type pending,
                          void* ctx) {
   TLSWrap* wrap = static_cast<TLSWrap*>(ctx);
+  HandleScope handle_scope(wrap->env()->isolate());
   Local<Object> buf_obj;
   if (buf != nullptr)
     buf_obj = Buffer::Use(wrap->env(), buf->base, buf->len);
```
Providing a tiny bit of additional context: this is a graph of memory usage across 6 instances of our application from a few weeks back (4x node v0.10.3?, 1x node v0.12.0 in orange, 1x iojs v1.3.0 in red). We're continuing to see the same behavior in 1.4.x. The application is a very simple proxy, seeing highs of ~350 reqs/sec, though more consistently ~80-100. Each incoming request results in 1 to 4 outbound requests.
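For a mental model of that workload, the proxy's shape is roughly the following (a hypothetical sketch, not the author's application; `upstream.example.com` is a placeholder):

```js
var http = require('http');
var https = require('https');

http.createServer(function(req, res) {
  // Each inbound request fans out to 1-4 outbound HTTPS requests,
  // which is what exercises the TLS read path under load.
  var pending = 1 + Math.floor(Math.random() * 4);
  var done = 0;
  for (var i = 0; i < pending; i++) {
    https.get('https://upstream.example.com/resource', function(up) {
      up.resume(); // drain the upstream body
      if (++done === pending)
        res.end('ok');
    });
  }
}).listen(8080);
```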
Looking.
It seems to be leaking.
Ensure that no handles will leak into the global HandleScope by adding HandleScopes in all JS-calling libuv callbacks in `stream_wrap.cc`. Fix: nodejs#1075
Partial fix here: #1078
I've got a test here that reproduces the problem, comparing HTTP and HTTPS on 0.10, 0.12, and iojs: https://github.com/GeoffreyPlitt/iojs_tls_bug
I'm going to try the partial fix above and see if it fixes my tests.
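For anyone who wants the shape of such a comparison without cloning the repo, here is a minimal HTTPS-side sketch (hypothetical, not the linked test; it assumes a self-signed `key.pem`/`cert.pem` pair in the working directory):

```js
var https = require('https');
var fs = require('fs');

var body = new Buffer(50 * 1024); // ~50kb payload, like the tarball case
body.fill('x');

var server = https.createServer({
  key: fs.readFileSync('key.pem'),   // hypothetical self-signed pair
  cert: fs.readFileSync('cert.pem')
}, function(req, res) {
  res.end(body);
});

server.listen(8443, function() {
  console.log('rss before:', process.memoryUsage().rss);
  var remaining = 300;
  (function next() {
    if (remaining-- === 0) {
      console.log('rss after:', process.memoryUsage().rss);
      return server.close();
    }
    https.get({ port: 8443, rejectUnauthorized: false }, function(res) {
      res.resume();          // drain the response body
      res.on('end', next);   // issue the next request serially
    });
  })();
});
```

Running the same loop against a plain HTTP server gives the baseline to compare RSS growth against.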
It seems like these lines are partly responsible for this problem: https://github.com/iojs/io.js/blob/b27931b0fedc5fad638d637525aba84b2754ed5f/lib/_tls_wrap.js#L290-L305
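For readers without the source handy: the referenced block installs proxied methods on the TLS handle that check for, and apply, the parent handle's method on every call. A paraphrase of the pattern (reconstructed from the discussion below, not the verbatim io.js source; the stub and the method list are illustrative):

```js
// Illustrative stub standing in for the native TLSWrap handle.
function TLSWrap(parent) { this._parent = parent; }

// Abbreviated, illustrative method list.
var proxiedMethods = ['readStart', 'readStop', 'shutdown', 'close'];

proxiedMethods.forEach(function(name) {
  TLSWrap.prototype[name] = function methodProxy() {
    // The existence check plus apply runs on every single call.
    if (this._parent[name])
      return this._parent[name].apply(this._parent, arguments);
  };
});
```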
Found a JSStream leak, but so far I don't have much progress on the main issue. It seems that the cross-reference between StreamWrap and TLSWrap creates some non-collectible cycle, which should not really be possible. Most likely it's some bug in my code, but I can't say what exactly it is yet. Will continue tomorrow.
Any particular reason that the existence check/apply in the proxied methods runs on every call? Can it not be resolved just once, along the lines of the sketch below?
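The snippet that followed this question did not survive this capture; one plausible reading of the suggestion, continuing the sketch above under that assumption, is to resolve and cache the parent method on first use:

```js
// Hypothetical reconstruction of the suggestion, not the original snippet.
proxiedMethods.forEach(function(name) {
  TLSWrap.prototype[name] = function methodProxy() {
    var method = this._parent[name];
    if (method) {
      // Cache the bound method on the instance so later calls skip
      // both the existence check and the apply indirection.
      this[name] = method.bind(this._parent);
      return this[name].apply(this, arguments);
    }
  };
});
```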
@Qard no reason, all of these methods are usually called once per handle lifetime. |
Looks like my patch fixes the issue.
Should be fixed now, please confirm! ;) |
This bug has been causing me a lot of issues. I'll certainly build with this change and let you know how the memory usage goes.
A nightly with this change is up: https://iojs.org/download/nightly/v1.4.4-nightly20150306dee07e2983/
Please test; I'd love to have some feedback by the time 1.5.0 goes out later today.
Deployed the nightly. Will beat on it for a couple of hours and report back. (fwiw, I know @aredridel reran her benchmark and saw the issue resolved as well.) |
I can confirm that, for me, memory usage is certainly much more stable. Before, it would climb to several hundred megs with no problem. I look forward to a full-blown release.
Really early observations on the 1.4.4 nightly suggest there may be more to it. @aredridel is profiling it now. (Graph legend: blue: 0.10, purple: 0.12, green: 1.4.4-nightly.)
Without valgrind, it seems to have leveled things off somewhat. However, once I take a deeper look, it still seems to grow. Here are the larger chunks of valgrind's output:
(ignore the v1.4.2 in the paths -- the actual binary being run is compiled from the pr/1078 branch) |
@aredridel this is bad news, as it could be related to https://code.google.com/p/v8/issues/detail?id=3949&thanks=3949&ts=1425654431, which I was experiencing with JSStream during debugging. Since I'm not able to reproduce this issue myself, I've pushed the suggested patch to https://github.com/indutny/io.js/tree/test/paypal-leak . May I ask you to give it a try? Thanks a lot!
@aredridel It's a crude hack, but can you try this patch? If it keeps running, great. If it aborts, there's still a handle leak somewhere.
Building now! The actual instrumentation I'm using is the |
I'm diagnosing a problem with PayPal's npm proxy, https://github.com/krakenjs/kappa, when run on iojs. After 300 requests for a 50kb tarball, allocated memory grows from a 250mb base to 450mb.
I've run it under valgrind, and there are some things that look a bit suspicious around tlswrap, smalloc, and some other crypto-related bits. The interesting bits of the valgrind report are:
Any help or suggestions for what to look at to track this down would be wonderful.
Node 0.12 and 0.10 don't show this behavior with the same code (we're stress-testing io.js in parallel to shake stuff like this loose).
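For anyone reproducing this, a trivial way to watch the growth from inside the process (an illustrative snippet, not part of kappa):

```js
// Log RSS and JS heap usage every 10 seconds while the proxy is under load;
// steadily climbing rss alongside flat heapUsed points at a native-side leak.
setInterval(function() {
  var m = process.memoryUsage();
  console.log('rss=%dMB heapUsed=%dMB',
              Math.round(m.rss / 1048576),
              Math.round(m.heapUsed / 1048576));
}, 10000).unref(); // unref so the timer alone doesn't keep the process alive
```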