test: skip reporting CanaryOnly failures for stable version tests #2698
Merged (+38 −22)
6 commits:
- afc762f ci: decrease number of shards when running Vercel tests (pieh)
- 960ef04 test: use PWD instead of hardcoded repo directory (pieh)
- 645f220 test: collect build logs in Vercel e2e tests (like we do in our own e…) (pieh)
- 21bb071 test: skip esm-externals-false tests (they are just checking CLI output) (pieh)
- 1f5c7f7 ci: skip reporting about tests that use canary only features when tes… (pieh)
- 7faa876 test: fix typo when setting _cliOutput (pieh)
Doing this here is a little funky because we report test results in three(?) different places, and this is just one of them. So now we'll have a mismatch between the Travis CI summary, the actual raw test output, and the e2e report built from this JSON (previously we also had a fourth, the Slack notification).

Maybe this is fine for now, but I wonder if there's a more direct way to configure the test runner the same way Vercel's CI does? I would imagine it's some slight env var change we need to make, or something 🤷🏼.
https://github.com/vercel/next.js/blob/a8de8730d75625e3bd068787121561cf6ab5eaac/packages/next/src/server/config.ts#L244

Setting this env var might prevent the build failure and allow actual usage of those features on stable versions (so the deploys wouldn't fail, but the features themselves might well not work correctly for us yet). I wasn't sure what can of worms I'd open by playing with this env var, so I went with skipping.

Overall, since we're producing this report for users to see and we generate it against the stable version, I think it's fair to skip or ignore tests for features that can't be used on stable versions (without getting into "test mode"). The change here might not be enough overall, but I think the direction is correct: skip/ignore them instead of trying to run them.

Those tests would still be reported when we run tests against canary, because we wouldn't hit the CanaryOnly error then.
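The skipping approach described above could be sketched roughly like this. This is an illustrative sketch only, not the actual PR diff: the function names, the `testResults` shape, and the exact error pattern matched are assumptions, not names from the real codebase.

```javascript
// Hypothetical sketch: when building the e2e report JSON against a stable
// Next.js version, drop failures caused by canary-only feature checks so
// they don't show up as "real" failures in the user-facing report.
// The CanaryOnly pattern and result shape here are illustrative assumptions.
const CANARY_ONLY_PATTERN = /CanaryOnly|requires the latest canary/i

function isCanaryOnlyFailure(result) {
  return (
    result.status === 'failed' &&
    CANARY_ONLY_PATTERN.test(result.errorOutput ?? '')
  )
}

function filterReportForStable(testResults, { isStableVersion }) {
  // Against canary, keep everything: the CanaryOnly error never fires there.
  if (!isStableVersion) return testResults
  return testResults.filter((result) => !isCanaryOnlyFailure(result))
}

module.exports = { filterReportForStable, isCanaryOnlyFailure }
```

Filtering at report-generation time (rather than in the runner) is what creates the mismatch noted earlier between this report and the raw test output.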
In any case, this is something we can change later, but we need something here, and I think this is good enough for now until we figure out exactly how we should handle this.