GTest functional tests that include parameters and uORB messaging #12521
Conversation
@jkflying Given you're in the guts of this, could you check that https://dev.px4.io/master/en/test_and_ci/unit_tests.html is accurate, and insert docs that show the essential things you had to do to get the uORB/param stuff working? (I am fairly sure this is not up to date, because it does not even mention any limitation w.r.t. uORB/params.) What I'd really like is a template test that might live against the template application module example created by @bkueng. Thoughts?
@hamishwillee There have been some improvements by @MaEtUgR since that was written, with tests now able to be added in the library folders and using the more industry-standard GTest framework. There are also definitely some issues in the documentation there, e.g. around running the tests.

Also, those tests are very fine-grained. The tests this PR enables will make sure that entire systems are working correctly together: that they read their uORB messages and parameters correctly, and we can even check that the messages published by one library can be read by another library.

Right now this is still in draft. I will update docs when it is complete 👍
I guess you know my opinion. I don't think the testing should be at this layer. To me it looks (and probably is) too complex. I would prefer the testing to be on a lower level around the actual "business logic" before stuff is put in a module.
I agree there are a lot of dependencies that get brought in, and it would be nice to have more self-contained tests, but it seems to me that the actual API used for most things is parameters and uORB messages. Those are what ultimately define the behavior. So, if we really want to do functional testing in a nice, easy-to-read given/when/then style, which is generally how business logic tests should work, we need to be doing it at the uORB and parameter layer. Otherwise we need to wrap everything with an extra layer just to test it, and there is a big swathe of code (the translation from the uORB API to the C/C++ API) that doesn't get tested, which is very prone to bugs due to e.g. different naming in the messages and the C/C++ API.

TL;DR: I also think this is messy, but it's the only way I see to not have gaps in our testing with a pub-sub architecture.
Thank you (re #12521 (comment)). If at all possible, please do consider adding a test module alongside Beat's example module. We're hoping more people take that and copy it as "best practice", so having good tests alongside it makes sense. If not, then perhaps the minimal thing would still be to link to the tests you think work well.
The problem with this testing is that it's harder to run on NuttX, and it conflicts with ntest, which would be very minimal and simple/stupid.
In my view, gtest running on the build computer and ntest running on the device serve very different purposes. We need them both.

With the amount of functionality that the firmware has, I would expect to need several thousand tests to get basic coverage of the critical components (navigator, commander, flight tasks, helper libraries and maybe some select drivers). To achieve this, we need to design for a single developer being able to easily write 20+ tests a day. So the testing environment needs to be as flexible as possible, so that we don't need to do much refactoring while writing tests. The tests also need to be fast to run, and every contributor needs to be able to run all of them locally, quickly (in a few seconds), to make sure nothing was broken.

Given this, I don't think adding the constraint of running them all on embedded hardware is going to be feasible. It will be slow, and we'll quickly hit the 2MB flash limit. On the other hand, most of the 'business logic' can be tested very easily on faster hardware which has more storage, with the basic assumption that the problems we are trying to prevent are due to bugs in the code itself, not things which are different between the build system and the device.

What needs to run on the device itself (e.g. with ntest) are the tests which make sure the hardware platform is as expected, so that the above assumption holds true: testing the POSIX interface, testing the driver interfaces, testing threading issues, and testing that network and serialization code works correctly in big- and little-endian environments. This way we can confidently transfer the results of the much larger suite of tests running elsewhere to the embedded platform.
@julianoes Ok, so instead of changing the existing unit tests too much, I decided to introduce a new test class: functional tests. This way the tests which don't have dependencies on parameters or uORB can still be used with the existing unit test infrastructure.

@hamishwillee I added a minimal example in the
I agree with that!
Ok, sounds like a good trade-off.
Thanks, this is truly awesome!
Could you rebase quickly and push? Thanks!
…4#12521)

* Add kdevelop to gitignore
* Add test stubs
* Rename px4_add_gtest to px4_add_unit_gtest
* Add infrastructure to run functional tests
* Add example tests with parameters and uORB messages
* Fix memory issues in destructors in uORB manager and CDev
* Add a more real-world test of the collision prevention
Describe problem solved by the proposed pull request
Right now our unit tests only work for the limited subset of code that doesn't depend on any parameters or uORB messages. This adds an example of building tests that depend on these.
Test data / coverage
Just some simple tests so far: writing and reading parameters, sending and receiving uORB messages. The more complex test gets data into the CollisionPrevention library via parameters and uORB messages, which then constrains a setpoint. This is more a proof of concept than an attempt to get good test coverage of the library.
Additional context
Having to list all of the dependencies again to get the stubs to be last is a pain. I'd love for there to be better tooling for that somehow.
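To make the pain point above concrete, the build rule might look roughly like this. This is an illustrative sketch only: the exact function name and argument keywords may differ from what the PR finally adds, but the key constraint is that the stub library has to appear after the real dependencies on the link line, which forces re-listing them.

```cmake
# Hypothetical sketch of a functional-test build rule.
px4_add_functional_gtest(
	SRC CollisionPreventionTest.cpp
	LINKLIBS
		CollisionPrevention   # library under test
		mathlib               # its dependencies, listed again...
		geo
		test_stubs            # ...so the uORB/param stubs resolve last
)
```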
This should enable full-coverage testing of the uORB and parameter API which most libraries use.