
Revert "Skip CLI tests on Windows until we resolve the blocking/hanging isuse. (#489)" #583

Open · wants to merge 6 commits into rolling from hidmic/reenable-cli-tests-on-windows

Conversation

@hidmic (Contributor) commented on Jan 12, 2021

Hopefully ros2/launch#476 fixed this 🤞

CI up to ros2topic, ros2action, ros2interface, ros2lifecycle, ros2node, ros2pkg, and ros2service (repeated 5 times):

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@hidmic force-pushed the hidmic/reenable-cli-tests-on-windows branch from 2f819bc to 4018dec on January 13, 2021 13:26
@hidmic (Contributor, Author) commented on Jan 13, 2021

Rebased to solve failing DCO check.

@hidmic (Contributor, Author) commented on Jan 13, 2021

Another round of CI up to ros2topic, ros2action, ros2interface, ros2lifecycle, ros2node, ros2pkg, and ros2service (repeated up to 5 times until failure):

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@ivanpauno (Member) left a comment


LGTM with green CI!

It's worth notifying everybody about these tests being reactivated, just in case they start hanging again.

@hidmic (Contributor, Author) commented on Jan 14, 2021

If and when Windows CI passes, I'll release launch into Rolling to get that Rpr job passing.

@hidmic (Contributor, Author) commented on Jan 14, 2021

Hmm, the tests don't hang anymore, but they are extremely flaky. I'll have to circle back.

@ivanpauno (Member) commented
This change could also be reverted here and here.

@hidmic (Contributor, Author) commented on Feb 18, 2021

Oh, I missed those. Thanks @ivanpauno! I'm still fighting to get the tests to pass on Windows, though.

Revert "Skip CLI tests on Windows until we resolve the blocking/hanging isuse. (#489)"

This reverts commit 8ad1208.

Signed-off-by: Michel Hidalgo <michel@ekumenlabs.com>
@hidmic force-pushed the hidmic/reenable-cli-tests-on-windows branch from a0e87ed to c0d8da2 on February 18, 2021 21:41
@hidmic (Contributor, Author) commented on Feb 18, 2021

Alright, with the latest commits I can no longer reproduce the flaky tests on Windows.

Spinning up CI up to ros2topic, ros2action, ros2interface, ros2lifecycle, ros2node, ros2pkg, and ros2service (repeated up to 5 times until failure):

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@hidmic (Contributor, Author) commented on Feb 19, 2021

Hmm, some of the failing ros2node tests need their timeouts increased, but the ros2topic test failures are perplexing: I cannot reproduce them in a Windows VM.
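
For illustration, a platform-dependent timeout bump along these lines might look like the sketch below. This is not the project's actual launch_testing-based ros2node test code; the command, timeout values, and test name are assumptions.

```python
# Hypothetical sketch of a platform-dependent timeout bump; not the actual
# ros2node tests, which are launch_testing-based. Values are illustrative only.
import subprocess
import sys

# Assume Windows needs a much larger budget for a CLI round-trip.
CLI_TIMEOUT = 30.0 if sys.platform == 'win32' else 10.0


def test_node_list_completes_within_timeout():
    # Run the CLI and fail (via TimeoutExpired) if it exceeds the platform budget.
    completed = subprocess.run(
        ['ros2', 'node', 'list', '--all'],
        capture_output=True, text=True, timeout=CLI_TIMEOUT)
    assert completed.returncode == 0
```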


As a side note, I am a bit worried about the long timeouts: ros2 action info takes about 15 seconds to query the daemon and reply on Windows. It's not unusable, but it's not far from it either. FYI @clalancette.
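
For reference, that latency can be reproduced roughly as sketched below; /fibonacci is a placeholder action name, not something from this PR.

```python
# Hypothetical latency check for 'ros2 action info'; /fibonacci is a placeholder
# action name, and ~15 s is the figure reported above for Windows.
import subprocess
import time

start = time.monotonic()
subprocess.run(['ros2', 'action', 'info', '/fibonacci'], check=False)
elapsed = time.monotonic() - start
print(f'ros2 action info took {elapsed:.1f} s')
```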

@clalancette (Contributor) commented

> As a side note, I am a bit worried about the long timeouts: ros2 action info takes about 15 seconds to query the daemon and reply on Windows. It's not unusable, but it's not far from it either. FYI @clalancette.

Yeah, that's a problem. It's not useful for users, and the other problem is that adding these tests looks like it will add ~1 hour to our (already very long) CI times.

osrf/osrf_pycommon#66 will help somewhat. Other than that, we'd need to go in and analyze what is taking so long.

@hidmic added the backlog label on Mar 9, 2021
@hidmic (Contributor, Author) commented on Mar 9, 2021

I'm proactively putting this in the Galactic backlog. Ideally, I think we should figure out why these CLIs are so slow on Windows before Galactic is out, but I don't know if that's realistic.

@hidmic removed the backlog label on Mar 9, 2021
@hidmic (Contributor, Author) commented on Apr 16, 2021

One thing I've noticed while doing manual source testing on Windows: CLIs are slow to respond when the daemon is up.
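
A quick way to quantify that difference is sketched below; it assumes a sourced ROS 2 environment, and 'ros2 topic list' is only an example command.

```python
# Hypothetical comparison of CLI latency with and without the daemon.
# Assumes a sourced ROS 2 environment; 'ros2 topic list' is only an example.
import subprocess
import time


def timed(cmd):
    # Return how long the given command takes to complete, in seconds.
    start = time.monotonic()
    subprocess.run(cmd, check=False, capture_output=True)
    return time.monotonic() - start


subprocess.run(['ros2', 'daemon', 'start'], check=False)
print('with daemon   : %.1f s' % timed(['ros2', 'topic', 'list']))
print('without daemon: %.1f s' % timed(['ros2', 'topic', 'list', '--no-daemon']))
```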

@hidmic (Contributor, Author) commented on Jun 17, 2021

The issue with CLI daemons on Windows is tracked by #637.

@audrow changed the base branch from master to rolling on June 28, 2022 14:23