test: skip reporting CanaryOnly failures for stable version tests #2698

Merged
6 commits merged on Oct 24, 2024
7 changes: 7 additions & 0 deletions tools/deno/junit2json.ts
Expand Up @@ -117,6 +117,13 @@ function junitToJson(xmlData: { testsuites: JUnitTestSuites }): Array<TestSuite>
if (skippedTestsForFile?.some(({ name }) => name === testCase['@name'])) {
continue
}

// Skip reporting on tests that fail even to deploy because they rely on
// experimental features not available in the currently tested version
if (testCase.failure?.includes('CanaryOnlyError')) {
continue
}
Comment on lines +121 to +125

Contributor
Doing this here is a little funky because we report test results in 3(?) different places and this is just one of them. So now we'll have a mismatch between the Travis CI summary thing, the actual raw test output, and the e2e report built from this JSON (previously we also had a fourth which was the Slack notification).

Maybe this is fine for now, but I wonder if there's a more direct way to configure the test runner the same way Vercel's CI does? I would imagine it's some slight env var change we need to make or something 🤷🏼.

Contributor Author

https://github.com/vercel/next.js/blob/a8de8730d75625e3bd068787121561cf6ab5eaac/packages/next/src/server/config.ts#L244 Setting this env var might prevent the build failure and allow actual usage of those features on stable versions (so the deploys wouldn't fail, but the features themselves might well not work correctly for us yet). I wasn't sure what can of worms playing with this env var could open, so I went with skipping.

Overall, since we are producing a report for users to see and we are generating it for the stable version, I think it's fair to skip or ignore tests for features that can't be used on stable versions (without getting into "test mode"). The change here might not be enough overall, but I think the direction is correct: skip/ignore those tests instead of trying to run them?

Those tests would still be displayed when we run tests against canary, because we wouldn't hit the CanaryOnly error then.
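For illustration, the skipping behavior from the diff can be sketched in isolation. This is a minimal sketch with hypothetical type and function names; the real `junit2json.ts` iterates over parsed JUnit XML inside `junitToJson`, whereas this only shows the filtering predicate:

```typescript
// Hypothetical minimal shape of a parsed JUnit test case; the real parser
// produces richer objects, but only these two fields matter here.
interface RawTestCase {
  '@name': string
  failure?: string
}

// A test case is dropped from the report when its failure message indicates
// it requires a canary-only feature not available on the stable version.
function shouldSkip(testCase: RawTestCase): boolean {
  return testCase.failure?.includes('CanaryOnlyError') ?? false
}

// Keep only the test cases that should appear in the generated report.
function reportableTestCases(testCases: RawTestCase[]): RawTestCase[] {
  return testCases.filter((tc) => !shouldSkip(tc))
}
```

Against canary, no failure message contains `CanaryOnlyError`, so nothing is filtered out and those tests show up normally.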

Contributor Author

In any case, this is something we can change, but we need something here, and I think this is good enough for now, at least until we figure out exactly how we should handle it.


const status = testCase.failure ? 'failed' : 'passed'
const test: TestCase = {
name: testCase['@name'],