
Gatsby develop refresh endpoint memory leak #25868

Closed
apaniel opened this issue Jul 20, 2020 · 13 comments
Labels
topic: source-contentful (Related to Gatsby's integration with Contentful)
type: bug (An issue or pull request relating to a bug in Gatsby)

Comments

apaniel commented Jul 20, 2020

Description

Each time the refresh endpoint is hit, memory increases a little bit and it doesn't go down, eventually leading to a heap out of memory after multiple refreshes.

develop process memory before a refresh: [screenshot]

develop process memory after a refresh: [screenshot]

Hitting the refresh endpoint more times just pushes memory higher.

Setting --max_old_space_size to a high value only delays the out-of-memory crash.

Steps to reproduce

1. gatsby develop
2. Hit the refresh endpoint
3. Wait until the refresh finishes
4. Hit the refresh endpoint again
5. Repeat

Expected result

Memory usage should return to its previous level after each refresh rather than growing without bound.

Actual result

Memory grows with every refresh and the process eventually crashes with a JavaScript heap out-of-memory error.

Environment

System:
OS: Windows 10
CPU: (12) x64 Intel(R) Xeon(R) CPU E5-1650 v4 @ 3.60GHz
Binaries:
Yarn: 1.21.0 - C:\Users\apa\AppData\Roaming\npm\yarn.CMD
npm: 6.14.4 - C:\Program Files\nodejs\npm.CMD
Languages:
Python: 3.8.3 - /c/Python38/python
Browsers:
Edge: 44.17763.831.0
npmPackages:
gatsby: 2.24.2 => 2.24.2
gatsby-graphiql-explorer: ^0.2.31 => 0.2.31
gatsby-plugin-react-decorators: 0.0.4 => 0.0.4
gatsby-plugin-react-helmet: ^3.1.18 => 3.1.18
gatsby-plugin-sass: ^2.1.26 => 2.1.26
gatsby-plugin-sharp: ^2.3.9 => 2.3.9
gatsby-plugin-typescript: ^2.1.23 => 2.1.23
gatsby-source-contentful: ^2.3.15 => 2.3.15
gatsby-source-filesystem: ^2.1.42 => 2.1.42
gatsby-transformer-json: ^2.2.22 => 2.2.22
gatsby-transformer-sharp: ^2.3.9 => 2.3.9

@apaniel apaniel added the type: bug An issue or pull request relating to a bug in Gatsby label Jul 20, 2020
@gatsbot gatsbot bot added the status: triage needed Issue or pull request that need to be triaged and assigned to a reviewer label Jul 20, 2020
@madalynrose (Contributor)

Hi @apaniel90vp!

Sorry to hear you're running into an issue. To help us find the underlying cause, it's incredibly helpful if you can create a minimal reproduction, either in a public repo or on something like CodeSandbox: a simplified example that makes it clear and obvious what the issue is and where we can begin debugging it.

If you're up for it, we'd very much appreciate if you could provide a minimal reproduction and we'll be able to take another look.

Thanks for using Gatsby! 💜

@madalynrose madalynrose added status: needs reproduction This issue needs a simplified reproduction of the bug for further troubleshooting. and removed status: triage needed Issue or pull request that need to be triaged and assigned to a reviewer labels Jul 21, 2020
@wardpeet (Contributor)

@apaniel90vp It's more likely an issue with gatsby-source-contentful than with the Gatsby refresh endpoint itself. Do you see the same behaviour without Contentful?

@apaniel (Author) commented Jul 22, 2020

@wardpeet that is a very good point; it probably doesn't happen without the Contentful plugin, since the highest memory peak I see is while the schema is being generated. I won't be able to check this today, but I will tomorrow. Thanks!

@axe312ger (Collaborator)

I can confirm that I saw this happening as well.

Any help identifying where it is coming from is very welcome :)

@freiksenet freiksenet removed the status: needs reproduction This issue needs a simplified reproduction of the bug for further troubleshooting. label Aug 12, 2020
@AnalogMemory (Contributor) commented Aug 26, 2020

Has anyone identified the root cause of this? I've been having memory issues this week. Nothing major has changed, so the sudden error is driving me crazy :/

  • Having two Contentful sources configured runs out of memory (removing one stops the issue)
  • Verbose mode says ~6000 nodes are created (so not too crazy)

When running gatsby build with more memory, it stalls at building schema:

  • node --max-old-space-size=5120 node_modules/.bin/gatsby build --verbose
  • Allocating even more memory makes no difference

I'm not sure of the best way to debug this further.

...

After further debugging: I removed all plugins except the two Contentful source entries. Current error:


<--- Last few GCs --->

[79604:0x102d59000]   175063 ms: Mark-sweep 2044.9 (2052.8) -> 2044.0 (2057.3) MB, 147.2 / 0.0 ms  (average mu = 0.115, current mu = 0.048) allocation failure scavenge might not succeed
[79604:0x102d59000]   175204 ms: Mark-sweep 2045.2 (2057.3) -> 2044.4 (2051.3) MB, 44.6 / 0.0 ms  (+ 70.5 ms in 15 steps since start of marking, biggest step 39.2 ms, walltime since start of marking 142 ms) (average mu = 0.149, current mu = 0.187) finaliz

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x271753bc08d1 <JSObject>
    0: builtin exit frame: stringify(this=0x271753bdee79 <Object map = 0x271797183639>,0x271726c004b1 <undefined>,0x2717325c1cc9 <JSFunction (sfi = 0x271740416209)>,0x271776fd3c29 <Object map = 0x2717971a82a9>,0x271753bdee79 <Object map = 0x271797183639>)

    1: stringify [0x2717b4275081] [/Users/am/Work/mm/node_modules/json-stringify-safe/stringify.js:~4] [pc=0x209a57f2ceaa](this=...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x1011c96b5 node::Abort() (.cold.1) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 2: 0x10009cae9 node::Abort() [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 3: 0x10009cc4f node::OnFatalError(char const*, char const*) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 4: 0x1001ddbc7 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 5: 0x1001ddb67 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 6: 0x100365a65 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 7: 0x1003672da v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 8: 0x100363d0c v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
 9: 0x100361b0e v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
10: 0x10036d9da v8::internal::Heap::AllocateRawWithLightRetry(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
11: 0x10036da61 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
12: 0x10033d8eb v8::internal::Factory::NewRawTwoByteString(int, v8::internal::AllocationType) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
13: 0x100709c79 v8::internal::IncrementalStringBuilder::Extend() [/Users/am/.nvm/versions/node/v12.18.0/bin/node]
14: 0x10046e97a v8::internal::JsonStringifier::SerializeString(v8::internal::Handle<v8::internal::String>) [/Users/am/.nvm/versions/node/v12.18.0/bin/node]

@axe312ger (Collaborator)

@AnalogMemory do you use Rich Text? There is a bug in the current version of the source plugin which blows up your memory as soon as you have Rich Text and some content referenced with a circular reference. The canary version at #25249 fixes this issue.

@wardpeet wardpeet added the topic: source-contentful Related to Gatsby's integration with Contentful label Sep 9, 2020
@AnalogMemory (Contributor) commented Sep 9, 2020

@axe312ger Yeah, I finally figured out it was a combination of:

  • Contentful embedded hyperlinks in Rich Text fields linking to entries with Rich Text fields (and then back and forth, forever)
  • gatsby-source-contentful seemingly being unable to handle that many circular references

One of my content editors started using more of them just before it happened. It seems to be fine with a few, but once over 100 entries were linking to each other, it killed memory while gatsby-source-contentful was trying to build the queries.
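The failure mode can be illustrated in isolation. This is a toy sketch, not plugin code: entryA/entryB and safeStringify are made-up names, and the cycle-aware replacer only mimics the spirit of json-stringify-safe, which appears in the crash stack earlier in this thread:

```javascript
// Two hypothetical entries whose Rich Text fields reference each other,
// mimicking "entries linking back and forth, forever".
const entryA = { id: "a", richText: {} };
const entryB = { id: "b", richText: { links: [entryA] } };
entryA.richText.links = [entryB]; // a -> b -> a: a circular reference

// Plain JSON.stringify throws on cycles...
let threw = false;
try {
  JSON.stringify(entryA);
} catch (e) {
  threw = true; // TypeError: Converting circular structure to JSON
}

// ...while a cycle-aware replacer substitutes a marker the second time
// it meets the same object, so serialization terminates.
function safeStringify(value) {
  const seen = new WeakSet();
  return JSON.stringify(value, (key, val) => {
    if (typeof val === "object" && val !== null) {
      if (seen.has(val)) return "[Circular]";
      seen.add(val);
    }
    return val;
  });
}

console.log(threw); // true
console.log(safeStringify(entryA));
// {"id":"a","richText":{"links":[{"id":"b","richText":{"links":["[Circular]"]}}]}}
```

The memory cost of walking such graphs grows quickly with the number of mutually linked entries, which matches "fine with a few, deadly with over 100".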

I tried the gatsby-source-contentful@next version, but it would have required rewriting things, and I didn't want to push it to production in that state.

I ended up finding a patch that @disintegrator posted and am using it to hold me over until a proper release is ready :)
#24221 (comment)

I do have a task to test out the next version when I'm back from vacation next week

Thanks!

@axe312ger (Collaborator) commented Sep 16, 2020

@AnalogMemory Please continue the discussion in #24221, as this ticket is originally about a memory leak bug in the browser while developing, not when sourcing the nodes on bootstrap.

@ImOnSmoko commented Sep 17, 2020

We're struggling with this issue as well. We are using the refresh endpoint to enable preview functionality for our content authors, so it is called frequently. Each call incrementally increases memory consumption until the application crashes.

Docker running node:12.18.3-alpine3.11

"gatsby": "^2.23.3",
"gatsby-image": "^2.2.42",
"gatsby-plugin-compile-es6-packages": "^2.1.0",
"gatsby-plugin-create-client-paths": "^2.1.22",
"gatsby-plugin-env-variables": "^1.0.1",
"gatsby-plugin-google-tagmanager": "^2.1.25",
"gatsby-plugin-manifest": "^2.2.42",
"gatsby-plugin-material-ui": "^2.1.9",
"gatsby-plugin-offline": "^3.0.35",
"gatsby-plugin-react-helmet": "^3.1.22",
"gatsby-plugin-react-svg": "^3.0.0",
"gatsby-plugin-robots-txt": "^1.5.0",
"gatsby-plugin-sharp": "^2.4.5",
"gatsby-plugin-sitemap": "^2.4.13",
"gatsby-plugin-svgr-loader": "^0.1.0",
"gatsby-plugin-typegen": "^1.1.2",
"gatsby-plugin-typescript": "^2.2.0",
"gatsby-plugin-web-font-loader": "^1.0.4",
"gatsby-source-contentful": "^2.3.35-next.63",
"gatsby-source-filesystem": "^2.1.48",
"gatsby-transformer-sharp": "^2.3.16",

@axe312ger (Collaborator)

Has anybody figured out yet what data is bloating the memory, or which call is causing it? Any help with the research is very much appreciated :)

@vladmiller
I experience the same behaviour with WordPress using gatsby-source-graphql.

@vladar (Contributor) commented Oct 27, 2020

Can you try gatsby@2.24.89? A potential improvement was shipped in #27685 (it won't fix this completely, but hopefully it will make the leak less severe).

@axe312ger (Collaborator)

This should have been fixed for a while now. If you still experience this issue with the latest version of gatsby-source-contentful, please let us know.


10 participants