Segmentation fault (core dumped) #317

Closed
6 tasks done
benmccann opened this issue Dec 26, 2021 · 20 comments
Labels
help wanted Extra attention is needed

Comments

@benmccann (Contributor) commented Dec 26, 2021

Describe the bug

I'm trying out Vitest for the first time by switching vite-imagetools to it. I ran vitest and got:

Segmentation fault (core dumped)
error Command failed with exit code 139.

Reproduction

git clone https://github.com/benmccann/imagetools.git
cd imagetools
git checkout vitest
yarn
cd packages/core
yarn test

System Info

The command doesn't work because this project uses yarn. I'm hoping to send a PR to switch to pnpm, but that's somewhat blocked on this issue.

Anyway, I'm using:

  • Vite 2.7.3
  • Vitest 0.0.113

Used Package Manager

yarn

Logs

Segmentation fault (core dumped)
error Command failed with exit code 139.

Validations

@antfu added the help wanted label Dec 26, 2021
@antfu (Member) commented Dec 26, 2021

Not sure if I can debug a segmentation fault; help wanted 😅

@userquin (Member) commented:

@benmccann can you test with version 0.0.117 (it should be fixed)?

@Demivan (Member) commented Dec 28, 2021

Tried it locally; it still segfaults. It is not a Node error like #319 but an actual segfault.

@exbotanical (Contributor) commented:

I messed around with this for a bit and can see that the tests are specifically failing when we call resolveConfig in generate-configs.spec.ts.

I don't know much about Node internals, but it looks like something about the work being delegated causes the segfault. As far as Vitest is concerned, the segfault happens once createWorkerPool is invoked.

I hope this helps :/

@GalassoLuca commented Mar 3, 2022

I'm having the same issue as #319, starting with the following error:

FATAL ERROR: v8::FromJust Maybe value is Nothing.
...

It always happens locally when running Vitest with the --coverage option, while it does not happen without this flag.

I say locally because the first time after I log in to my Mac it seems to work, while after that it fails most of the time. In CircleCI it happens only a few times (I think because each time a new VM is created from scratch). (The issue I had in CircleCI was because cpus().length wrongly returned 35 instead of 4; I assume they intentionally modified it because they want me to use their parallelization, and therefore pay for more VMs, instead of using all the threads of a single VM.)
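
For reference, a quick way to check what CPU count Node reports in an environment like CircleCI is a one-liner like the following; a minimal sketch, it just prints os.cpus().length, the same value mentioned above:

node -e "console.log(require('os').cpus().length)"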

Furthermore

  • since it's not predictable, it's difficult to reproduce and to understand the exact scenario
  • when disabling threads, this issue disappears, but the mocks fail to work.
  • when setting both minThreads and maxThreads to 1, the point (the suite file logged in the console) that generates the error is different every time (see the config sketch right after this list).
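
For reference, pinning Vitest to a single worker thread, as mentioned in the last bullet, looks roughly like this; a minimal sketch assuming a standard vitest.config.ts (the file name and the defineConfig import are the usual convention, not something taken from this thread):

// vitest.config.ts
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    minThreads: 1, // lower bound for the worker pool
    maxThreads: 1, // upper bound; 1/1 forces a single worker thread
  },
})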

Notes

  • Vitest: 0.5.9
  • Node: 14.9.0
  • pnpm: 6.29.1

@GalassoLuca commented Mar 4, 2022

BTW, I'm stress testing it, and one time it failed with:

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the trustlayer-api@1.10.15 test:ci script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR!     /Users/luca/.npm/_logs/2022-03-04T11_38_10_920Z-debug.log

And this is the log file
2022-03-04T11_38_10_920Z-debug.log

In the stress test, I discovered that for a certain set of files this issue does not appear at all, so I assume there is one suite that sometimes generates an internal error that is not caught properly.

@oneillsp96 commented:

Does anyone have a line on this? @benmccann does it seem that Vitest cannot solve these segfault issues until Node addresses the VM issue on their side?

@staadecker commented Apr 27, 2022

I'm also running into this issue; it would be nice to have a fix or a workaround!

@quinnsprouseaware commented:

I'm running into this issue as well 😞 Hope this gets fixed.

@sheremet-va (Member) commented May 10, 2022

We made a change in the latest version that should help with that; can you check?

@richardmward commented May 10, 2022

* when setting both `minThreads` and `maxThreads` to `1`, the point (the suite file logged in the console) that generates the error is different every time.

Prior to Vitest 0.12.3, the only way I could reliably run the tests was to set minThreads / maxThreads to 1. I'm on Windows 10, Node 16.14 (although we get it on a Linux-based CI box too).

Since updating to 0.12.3, I have had the issue once (the first time), but since then have had fewer issues:

vitest watch worked 9/10 times (first one failed)
vitest run --coverage worked 5/10 times (last 5 failed)

From the experience I had with older releases of Vitest, once the error occurs, it does tend to keep occurring. I'm not sure at what point it starts working again, but it will at some point!

For now I am more than happy with my workaround of setting minThreads and maxThreads to 1, but I thought I'd add my experience here in case it helps.

@GalassoLuca commented:

I'm testing it further, and with Vitest 0.12.3 the v8 FATAL ERROR is no longer occurring.

This is the project I'm working on:

  Test Files  192 passed (192)
  Tests  719 passed | 2 todo (721)
  Time  214.79s (in thread 64.47s, 333.15%)

@benmccann (Contributor, Author) commented:

Upgrading to the latest version did not help me, but rebasing against JonasKruckenberg/imagetools did fix it. I'm not exactly sure what fixed it, but I'm guessing it's because the repo switched from yarn to pnpm.

That's good enough for me, so I'll close this issue. If other people are still encountering this, I'd recommend filing a new issue with a repository that reproduces your error.

@mikemaccana commented:

Hrm, using 0.26.2 here and getting segmentation faults. I don't want to file a new issue because I don't have a repo to reproduce it, but I can say the error happens more often the more test files I run in parallel; this is why I can't reproduce it, as running tests individually works.

npx vitest run src/backend/*test.ts - seg fault
npx vitest run src/backend/[a-u]test.ts - seg fault most of the time
npx vitest run src/backend/[a-k]test.ts - works
npx vitest run src/backend/some-file.test.ts (with any file including the ones at the end of the alphabet) - works

@loxator commented Feb 22, 2023

Bumping, as I am facing a similar issue to @mikemaccana's. If I disable threads, the tests run OK.

@Christopher-Stevers commented:

I'm running into the same issue as well.

@rushil23 commented:

Facing this issue as well. Re-opening an issue because this one's closed for some reason.

@alice-was-here commented:

I'm having this same issue; it occurs in some of our CI/CD tests, but not when I run the same suite of tests locally.

Originally running version v0.26.2, bumped to v0.30.1 with the same issue.

@colinparsonscom commented:

Dealing with the same issue here too!

@Max101 commented May 10, 2023

I have the same issue. Has anyone figured out how to fix this?

@sheremet-va (Member) commented:

If you have segmentation errors, disable threads.

With config:

export default {
  test: {
    threads: false,
  }
}

With CLI:

vitest --no-threads

@vitest-dev locked as resolved and limited conversation to collaborators May 10, 2023