
Question: Persistent reference & Garbage Collector #985

Closed
overflowz opened this issue Apr 29, 2021 · 4 comments

Comments

@overflowz

Before I explain, please keep in mind that I only recently started exploring V8 and N-API.

My question: I have two ObjectWrap classes, and one object depends on the other (passed in via a constructor argument). Because of this, I'm creating a Persistent reference to it.

The problem is that whenever I release the reference (ref.Reset()), the referenced object only gets garbage collected ~10 seconds later.

Is there any way to speed that up (i.e., collect it immediately), or am I doing something wrong? If so, what is the right way to make one object depend on another?

Here's the PoC:

```cpp
// xsocket.cc
#include "xsocket.h"
#include "context.h"

Napi::Object XSocket::Init(Napi::Env env, Napi::Object exports) {
  auto func = DefineClass(env, "XSocket", {});

  exports.Set("XSocket", func);
  return exports;
}

XSocket::XSocket(const Napi::CallbackInfo& info)
    : Napi::ObjectWrap<XSocket>(info) {
  auto options = info[0].As<Napi::Object>();
  if (options.Has("context")) {
    printf("got context option, creating reference\n");
    // here, creating a persistent reference to the context object
    context_ref.Reset(options.Get("context").As<Napi::Object>(), 1);
  }
}

XSocket::~XSocket() {
  printf("XSocket: Destroy\n");
  context_ref.Reset();
}
```

```cpp
// context.cc
#include "context.h"

Napi::Object Context::Init(Napi::Env env, Napi::Object exports) {
  auto func = DefineClass(env, "Context", {});

  exports.Set("Context", func);
  return exports;
}

Context::Context(const Napi::CallbackInfo& info)
    : Napi::ObjectWrap<Context>(info) {
}

Context::~Context() {
  printf("Context: Destroy\n");
}
```
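The headers referenced by the PoC are not shown. A minimal sketch of what `xsocket.h` might look like, assuming node-addon-api (the `context_ref` member name is taken from the .cc above; everything else is guesswork):

```cpp
// xsocket.h (hypothetical sketch, not part of the original post)
#pragma once
#include <napi.h>

class XSocket : public Napi::ObjectWrap<XSocket> {
 public:
  static Napi::Object Init(Napi::Env env, Napi::Object exports);
  XSocket(const Napi::CallbackInfo& info);
  ~XSocket();

 private:
  // Persistent handle intended to keep the Context object alive.
  Napi::ObjectReference context_ref;
};
```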
```js
// test.js (run with: node --expose-gc test.js, which enables global.gc)
const lib = require('bindings')('gctest');

let context = new lib.Context();
new lib.XSocket({ context });

global.gc();
setInterval(() => {}, 1000); // but if I remove this line, then Context is freed immediately.
```

output:

```
XSocket: Destroy
// and after ~10 seconds, while I'm expecting it to be called immediately:
Context: Destroy
```

Thank you.

@KevinEady
Contributor

Errmmm... try putting your JS code into a block that would pop scope. This is just a guess: you're calling global.gc(), but the object created by new lib.Context() is still in scope... I dunno:

```js
// test.js
const lib = require('bindings')('gctest');
{
  let context = new lib.Context();
  new lib.XSocket({ context });
}
global.gc();
```

@overflowz
Author

@KevinEady Tried, it has the same behavior.

@mhdawson
Member

mhdawson commented May 5, 2021

You can't depend on when an object will be collected. Once there is no longer a reference, it's up to the GC to decide when (or if) it wants to free the object in order to reclaim space.

We do use things like global.gc() in tests because we want to exercise specific scenarios, but those tests are prone to break or fail as the GC evolves. For non-test code, you really want to avoid anything that depends on the timing of when the GC will free objects (including avoiding the use of finalizers whenever possible).
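To illustrate that point with a hypothetical plain-JS sketch (not part of the addon): a FinalizationRegistry tells you that an object was collected, but gives you no control over, or guarantee about, when the callback runs:

```javascript
// A FinalizationRegistry callback fires at some GC-chosen time, if ever.
const registry = new FinalizationRegistry((label) => {
  console.log(`${label} was collected (timing chosen by the GC)`);
});

function makeContext() {
  const context = { name: 'context' };
  registry.register(context, 'context');
  // Only a weak handle escapes this scope, so `context` is collectable.
  return new WeakRef(context);
}

const ref = makeContext();
// ref.deref() may return the object or undefined, depending on GC state.
```

This is why logic that depends on a finalizer having already run is fragile: the spec makes no promise that the callback fires promptly, or at all.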

Even if you get it to do what you want now, it will quite likely stop doing what you want at some later time.

In terms of memory leaks, what I generally do is try to validate that an object will "never" get freed. If it's just a matter of time, then things are working as expected. If you can show that you will run out of memory (assuming you give the app a reasonable heap) before the object gets freed, that is one way to tell a real leak apart from the GC simply not getting around to freeing the memory.

So, for example, you can run the code that may have a leak in a loop. If the heap size grows continuously until you get an OOM, you may have a leak. If it runs for a long time, then most likely not. I often fix the heap at something like 128M to shorten the time before an OOM occurs (however, I'd avoid going too small, as that often just results in there not being enough memory for the app to run at all).
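That loop-based check could be sketched in plain JS like this (hypothetical; the throwaway allocation stands in for the suspect native calls such as `new lib.XSocket({ context })`):

```javascript
// Hypothetical leak check: run the suspect code in a loop and watch
// heapUsed. A real leak grows without bound until OOM; GC laziness
// eventually plateaus instead.
function suspectCode() {
  // Stand-in for the code under suspicion.
  const junk = new Array(1e5).fill(0).map((_, j) => ({ j }));
  return junk.length;
}

const samples = [];
for (let i = 0; i < 5; i++) {
  suspectCode();
  samples.push(process.memoryUsage().heapUsed);
}
console.log(samples.map((b) => Math.round(b / 1024 / 1024) + ' MB'));
```

To cap the heap as suggested above, you could run it with something like `node --max-old-space-size=128 check.js` (the file name is illustrative).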

@mhdawson
Member

There has not been any further discussion for a month, so I'm going to close this. Please let us know if you think that was not the right thing to do.
