👨🌾 launch_testing_ros: regressions in wait_for_topic_launch_test on linux (#304)
Today it appeared without an exception, just the test failure; see:
I'm able to reproduce both failures (
Do you think this test failure might be related to this issue?
Today it appeared in the humble coverage job: https://ci.ros2.org/job/nightly_linux_humble_coverage/16/testReport/junit/launch_testing_ros.test.examples/wait_for_topic_launch_test/wait_for_topic_launch_test/
It has also been appearing often in the linux_coverage jobs:
👨🌾 This test fails at a flaky rate of ±7%. References from last month:
Happened today in aarch64-debug: 2197
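For context, a flaky rate like the ±7% quoted above is just the fraction of nightly runs in which the test failed. A minimal sketch of that computation, using hypothetical pass/fail data in place of the real ci.ros2.org job history:

```python
# Hypothetical outcomes for one nightly job over a month
# (True = run passed, False = wait_for_topic_launch_test failed).
# The real data would be scraped from the ci.ros2.org job history.
outcomes = [True] * 28 + [False] * 2  # 2 failures in 30 runs


def flaky_rate(results):
    """Percentage of runs in which the test failed."""
    failures = sum(1 for passed in results if not passed)
    return 100.0 * failures / len(results)


print(f"{flaky_rate(outcomes):.2f}%")  # 2 failures / 30 runs ≈ 6.67%
```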
This issue's appearance on the ros2 buildfarm increased considerably during the last week. Flakiness report 2023-02-21:
- Rci__nightly-debug_ubuntu_jammy_amd64
- nightly_linux_coverage
- Rci__nightly-release_ubuntu_jammy_amd64
- nightly_linux-rhel_debug
- nightly_linux_debug
- nightly_linux-rhel_repeated
- nightly_linux_humble_coverage
- nightly_linux_repeated
- nightly_linux-aarch64_debug
I think the reason for that is that there were 2 weeks in there where we basically weren't running any of the rclpy-based tests (this was due to a bug in colcon that has since been fixed). So it may not be something that actually increased; we just really weren't testing it much over the last month. Can you go back and compare with historical data? Like, what was the percentage failure in November and December 2022?
Rci__nightly-debug_ubuntu_jammy_amd64: 6.56% in both November and December. Yes, I think they are about the same as before.
Flakiness Report 2023-03-27
I think these percentages are higher than ever. Some jobs almost reached 50% flakiness. @clalancette Do you think we should disable this test?
No, I don't think so. The ones that we actually mostly care about (
With #360 merged, I'm going to call this "fixed". But please reopen if this starts occurring again.
A similar problem is happening in Humble Coverage at a 10% flaky ratio. Reference build: https://ci.ros2.org/job/nightly_linux_humble_coverage/594/ (wait_for_topic_launch_test.TestFixture.test_topics_successful failed)
I'm not sure if this is the same problem, or a different but similar one.
This seems to be happening consistently in humble coverage jobs now, see: https://ci.ros2.org/job/nightly_linux_humble_coverage/944/
This is happening as a flaky issue. Flakiness for the last 15 days:
Last failures:
Bug report

Required Info:
- Operating system: jammy
- Client library: rclpy
Steps to reproduce issue

Run a buildfarm job on nightly_linux_debug.
Expected behavior
All tests pass.
Actual behavior
Various launch_testing_ros tests fail and throw exceptions. See: https://ci.ros2.org/view/nightly/job/nightly_linux_debug/2251/
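The failing test waits for messages on a topic before a deadline, which makes it sensitive to scheduling delays on slow debug and coverage builds. A pure-Python sketch of that wait-with-timeout pattern (no ROS involved; the function names and timings here are illustrative, not the actual test code):

```python
import threading

def wait_for_message(event: threading.Event, timeout: float) -> bool:
    """Block until a 'message' arrives or the deadline passes."""
    return event.wait(timeout)

def run_scenario(publish_delay: float, timeout: float) -> bool:
    """Simulate a publisher that starts publishing after publish_delay seconds."""
    received = threading.Event()
    publisher = threading.Timer(publish_delay, received.set)
    publisher.start()
    try:
        return wait_for_message(received, timeout)
    finally:
        publisher.cancel()

# Fast machine: the publisher beats the deadline, the test passes.
print(run_scenario(publish_delay=0.01, timeout=0.5))  # True
# Loaded debug/coverage build: startup is slower than the deadline.
print(run_scenario(publish_delay=0.5, timeout=0.1))   # False
```

Under this model, flakiness shows up whenever machine load pushes the effective publish delay past the test's timeout, which is consistent with the failures clustering in the debug, repeated, and coverage nightly jobs.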
Additional information
wait_for_topic_launch_test.TestFixture.test_topics_unsuccessful shows: Error log

wait_for_topic_launch_test.TestFixture.test_topics_successful shows: Error log