
Remove tests that often fail from nayduck #4362

Closed
bowenwang1996 opened this issue Jun 14, 2021 · 0 comments · Fixed by #4674
Assignees
Labels
A-python-test (Area: issues related to python tests), C-housekeeping (Category: Refactoring, cleanups, code quality), Node (Node team), T-node (Team: issues relevant to the node experience team)

Comments

@bowenwang1996 (Collaborator) commented:

Some nightly tests fail most of the time; we should exclude them from the NayDuck results so it is easier to see whether a regression has actually happened.

@bowenwang1996 bowenwang1996 added A-python-test Area: issues related to python tests C-housekeeping Category: Refactoring, cleanups, code quality labels Jun 14, 2021
@bowenwang1996 bowenwang1996 added the T-node Team: issues relevant to the node experience team label Jun 28, 2021
mina86 added a commit to mina86/nearcore that referenced this issue Aug 11, 2021
When check_pytest.py and check_nightly.py read the test list files,
handle commented-out include directives (i.e. `#./<path>` lines) so
that an include can be commented out with a TODO and the check
scripts will still treat files listed in those files as mentioned.

Issue: near#4362
Issue: near#4552
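The behaviour described in the commit message above can be sketched in Python. This is a hypothetical illustration, not the actual nearcore check-script code: it assumes a test-list format in which an include directive is a line beginning with `./<path>`, and it treats a commented-out directive (`#./<path>`) as still "mentioning" the included file.

```python
# Hypothetical sketch of collecting files "mentioned" by a test-list
# file. A commented-out include (`#./<path>`) counts the same as an
# active one, so an include disabled with a TODO still registers.
import re

# Optional leading `#`, then an include path starting with `./`.
_INCLUDE_RE = re.compile(r'^\s*#?\s*(\./\S+)')

def mentioned_files(lines):
    """Return include paths found in the given lines, active or commented out."""
    found = []
    for line in lines:
        match = _INCLUDE_RE.match(line)
        if match:
            found.append(match.group(1))
    return found
```

With this rule, commenting an include out (with or without a TODO note) does not cause the check scripts to report the included files as unmentioned.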
near-bulldozer bot pushed a commit that referenced this issue Aug 12, 2021
When check_pytest.py and check_nightly.py read the test list files,
handle commented-out include directives (i.e. `#./<path>` lines) so
that an include can be commented out with a TODO and the check
scripts will still treat files listed in those files as mentioned.

Issue: #4362
Issue: #4552
mina86 added a commit to mina86/nearcore that referenced this issue Aug 12, 2021
With NayDuck nightly runs always including some failing tests, it's
hard to use them to check whether a change introduced a regression.
Comment out all the tests which consistently fail so that the nightly
runs become green.

Fixes: near#4362
Issue: near#4618
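For illustration, a commented-out entry in a nightly test list might look like the snippet below. The test names and the exact list syntax here are assumptions made for the example, not the real nearcore nightly configuration:

```
# Runs nightly:
pytest sanity/block_production.py

# TODO(#4618): consistently failing; commented out so nightly runs stay green:
# pytest sanity/state_sync_late.py
```

Combined with the check-script change above, the commented-out line still counts as mentioned, so disabling a flaky test this way does not trip the consistency checks.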
near-bulldozer bot pushed a commit that referenced this issue Aug 13, 2021
…4674)

With NayDuck nightly runs always including some failing tests, it's
hard to use them to check whether a change introduced a regression.
Comment out all the tests which consistently fail so that the nightly
runs become green.

Fixes: #4362
Issue: #4618
@gmilescu gmilescu added the Node Node team label Oct 19, 2023