[node][7.102.0] Suspicion of Memory Leak #10790
Comments
Hey @xr0master, thanks for writing in! Unfortunately, this is going to be quite hard to figure out without a reproducible example. Any chance you could provide one? Another question: are you creating multiple hubs on purpose, or is this just something you observed with the single Sentry.init setup?
Hey @Lms24
It's definitely not easy... I'll try to do it.
Second option: just a regular integration with express.js. As far as I remember, only one hub should be created for the entire application, is that correct?
@Lms24
npm run build
npm run dev
Send GET requests.
Since the application is small, catching this issue is even more difficult. It took dozens of requests at different intensities, plus several snapshots, before I managed to reproduce it. Good luck with this.
@xr0master thanks for the repro and for all the details! Debugging such leaks is very hard, but we'll try to get to the bottom of this. One question though: did this only happen in the most recent version? Have you by any chance tried an older SDK version where this didn't happen?
We noticed a memory leak in our production environment and I'm pretty sure it's the same issue, so I'm dumping some info here in case any of it helps.

We were on @sentry/node 7.91.0 and upgraded to 7.104.0 during our routine package maintenance, after which we noticed the memory leak in production. I took 7 or so memory snapshots that I still have. When comparing the snapshots I noticed a constant increase in the delta size, especially in strings and objects. Looking more into the string ones, they all looked something like this:

Like xr0master, we are also running Express with the same setup as in the config, on Node 18 on Elastic Beanstalk. We did have the node/tracing package installed, which I believe is deprecated. We did remove it, but I assume that wasn't what fixed it. We downgraded to 7.91.0 about 2 hours ago and memory has been back to normal since.

@Lms24 let me know if there is anything we can provide to assist.
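For anyone who wants to capture comparable snapshots from a running Node service, here is a minimal sketch; the output directory and interval are placeholders, not values taken from this report:

```js
// Write a V8 heap snapshot every few minutes so successive dumps can be
// diffed in Chrome DevTools ("Comparison" view in the Memory tab).
const v8 = require('node:v8');
const path = require('node:path');

const SNAPSHOT_DIR = '/tmp';        // placeholder location
const INTERVAL_MS = 10 * 60 * 1000; // placeholder interval (10 minutes)

setInterval(() => {
  const file = path.join(SNAPSHOT_DIR, `heap-${Date.now()}.heapsnapshot`);
  // writeHeapSnapshot is synchronous and can pause the process for a while,
  // so only enable this temporarily while investigating.
  v8.writeHeapSnapshot(file);
  console.log('wrote heap snapshot:', file);
}, INTERVAL_MS);
```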
Same here: memory leak after upgrading Sentry from 7.84.0 to 7.102.0. Reverted back to 7.84 and we're OK again. Node v18.17.0; we also noticed this on Node v20.11.1.
Echoing this: we upgraded from 6.2.2 to 7.101.1 and are encountering severe memory leaks, with heap snapshots similar to those posted above!
FWIW, we don't use Sentry in our Express config at all, only in our Vue application, and we also noticed this memory leak (after upgrading 7.84.0 to 7.102.0). We simply execute Sentry.init with some config options: ignoreErrors, environment, attachStacktrace (true), release, debug (false, unless on dev/test), tracesSampleRate set to 0, sampleRate set to a very low figure (as we have lots of traffic), allowUrls (a regex with only our domain), and the Vue app itself (for Sentry.Vue; for server-side rendering with Sentry.Node we can't supply the app). Testing is quite time-consuming, but if you have a suspect version I'm happy to run a load test on our application with a before/after.
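For context, a sketch of roughly what such an init looks like with @sentry/vue; the DSN, release, ignore patterns, and domain regex below are placeholders, not the actual values from this comment:

```js
import * as Sentry from '@sentry/vue';
import { createApp } from 'vue';

const app = createApp({ /* root component */ });

Sentry.init({
  app,                                            // the Vue app instance
  dsn: 'https://publicKey@o0.ingest.sentry.io/0', // placeholder DSN
  ignoreErrors: [/ResizeObserver loop/],          // placeholder pattern
  environment: 'production',
  attachStacktrace: true,
  release: 'my-app@1.2.3',                        // placeholder release
  debug: false,
  tracesSampleRate: 0,      // tracing effectively disabled
  sampleRate: 0.01,         // only a small share of errors, due to traffic
  allowUrls: [/https?:\/\/(www\.)?example\.com/], // placeholder domain regex
});

app.mount('#app');
```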
@daan-nu if you could provide heap snapshots or similar, that would be a huge help for us to debug further.
I recommend you don't use
@sarunast did you check the order of our handlers as described in this comment above?
As you said, it's Next.js, so we don't configure it for Express. We use the very basic recommended Next.js Sentry setup, so there is not much to add. There is one odd thing about the setup: we do not deploy via Vercel, but rather via our own custom Dockerized deployments to AWS. Due to this custom setup, Sentry only works for the client and the backend reporting doesn't work at all. Not sure if it helps; it would be great to fix the backend reporting, but I think that's not related to this issue?
Apologies, I completely missed this in your first comment. Unfortunately, I think we're talking about multiple different memory leaks and scenarios by now. The only reproduction we received so far was for the Node SDK, which no longer leaks memory once the order of registering the Sentry Node/Express handlers is corrected. Furthermore, users are mentioning different version ranges for when the leak started happening. If you're able to provide a minimal reproduction for the leak you experience in your Next.js app, we'd greatly appreciate it!
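For reference, a minimal sketch of the handler ordering in question, based on the v7 Node/Express integration; the DSN, route, and port are placeholders:

```js
const express = require('express');
const Sentry = require('@sentry/node');

const app = express();

Sentry.init({
  dsn: process.env.SENTRY_DSN, // placeholder: taken from the environment
  tracesSampleRate: 1.0,
});

// The request handler must be the first middleware on the app.
app.use(Sentry.Handlers.requestHandler());
// The tracing handler comes next if performance monitoring is enabled.
app.use(Sentry.Handlers.tracingHandler());

// ...all routes/controllers are registered here...
app.get('/', (req, res) => res.send('ok'));

// The error handler must be registered after all routes,
// but before any other error-handling middleware.
app.use(Sentry.Handlers.errorHandler());

app.listen(3000);
```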
Looking at https://github.com/getsentry/sentry-javascript/blob/develop/CHANGELOG.md#71000, the related changes are:
What Node version is everyone using? What SDK features is everyone using (errors, performance, profiling, etc.)? For Next.js in particular, are you using the app or pages router? Because of the variety of different permutations of possible setups, sharing your setup would be very helpful.
@AbhiPrasad I think it is this change, where we introduced strong references to scopes on spans: #10492
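As a simplified illustration of why a strong reference like that can retain per-request memory (this is an analogy, not the SDK's actual code): if a long-lived structure holds spans, and each span holds its scope, then everything the scope points at, including the request and response, stays reachable.

```js
// Simplified analogy, not Sentry's implementation: a long-lived registry
// of spans, each of which keeps a strong reference to a per-request scope.
const retainedSpans = [];

function handleRequest(req, res) {
  const scope = { request: req, response: res, extras: {} };
  const span = { name: req.url, scope }; // span -> scope -> req/res
  retainedSpans.push(span);              // never removed, so req/res can never be GC'd
  res.end('ok');
}
```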
Considering we now have a new major version that made some improvements here (https://docs.sentry.io/platforms/javascript/guides/node/migration/v7-to-v8/), closing this issue for now as such.
Is there an existing issue for this?
How do you use Sentry?
Sentry SaaS (sentry.io)
Which SDK are you using?
@sentry/node
SDK Version
7.102.0
Framework Version
express.js
Link to Sentry event
No response
SDK Setup
Steps to Reproduce
app.use(Sentry.Handlers.requestHandler());
Expected Result
n/a
Actual Result
I would like to point out that the problem is not one-to-one. That is, not every request gets stuck in memory: after dozens of requests, sometimes only one object is added, and sometimes more. I didn't find any particular pattern.
You may notice that multiple hubs are created, and the data (in locals) is not removed from the response object (and the response object is not deleted either; see the sketch below for one way to verify this).
If
app.use(Sentry.Handlers.requestHandler());
is removed, there are no memory issues.
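A minimal sketch for checking whether response objects are ever collected, assuming Node 14.6+ and an Express app named `app`; the logging labels and GC interval are placeholders:

```js
// Log when Express response objects are garbage-collected. If requests keep
// coming in but no "collected" lines ever appear, the responses are retained.
const registry = new FinalizationRegistry((label) => {
  console.log('collected:', label);
});

let seq = 0;

app.use((req, res, next) => {
  registry.register(res, `response #${++seq} ${req.method} ${req.url}`);
  next();
});

// Optionally force GC while testing (start node with --expose-gc):
// setInterval(() => global.gc(), 30_000);
```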