
RFC: CoAP release specs #73

Closed
miri64 opened this issue Oct 12, 2018 · 5 comments


miri64 commented Oct 12, 2018

With RIOT-OS/RIOT#7428 I was wondering if we could integrate CoAP into our release specifications (and was actually astonished that we still don't include it). It is a very important protocol for the IoT and should thus be rigorously tested (IMHO tests with native should be enough, though). We could use the plugtests for CoAP and CoAP-RD that were done at various IETF-related events as a template for that.


kb2ma commented Oct 13, 2018

Sounds like a good idea to me. I actually have some functional tests for gcoap based on pyexpect that I run to verify no regressions with a new PR. Based on that, I would test the following, in order of importance:

  1. Client/Server non-confirmable GET -- the most basic request
  2. Client confirmable GET, including retries -- ensure that 1 retry works and that 4 retries fail, which triggers timeout handling
  3. Client/Server non-confirmable PUT/POST
  4. Observe registration and notifications (server side, gcoap only) -- see the observe functional test
  5. Block1 reception (server side, nanocoap only) -- maybe use the sha256 example
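For reference, item 1 can even be checked at the byte level without any CoAP library. A minimal sketch, assuming plain RFC 7252 encoding (the `/riot/value` path is chosen purely for illustration, not taken from any RIOT test):

```python
import struct

def coap_non_get(msg_id, path_segments):
    """Build a raw CoAP non-confirmable (NON) GET request per RFC 7252."""
    # Ver=1, Type=1 (NON), TKL=0  -> first byte 0x50; Code 0.01 = GET -> 0x01
    header = struct.pack("!BBH", 0x50, 0x01, msg_id)
    opts = b""
    prev = 0
    for seg in path_segments:
        delta = 11 - prev          # Uri-Path option number is 11
        seg_b = seg.encode()
        # Short option form only (delta and length < 13), enough for a sketch
        assert delta < 13 and len(seg_b) < 13
        opts += bytes([(delta << 4) | len(seg_b)]) + seg_b
        prev = 11
    return header + opts

msg = coap_non_get(0x1234, ["riot", "value"])
# 4-byte header, then option 0xB4 "riot" and 0x05 "value"
```

A test could send these bytes over UDP port 5683 to a native instance and check the response code in the first reply datagram.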

RIOT's RD support is client side, so it could be used simultaneously to verify some of these items, as well as RD itself.
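As a sanity check for the confirmable-GET retry case (item 2 above), the expected timing follows RFC 7252's exponential backoff. A small sketch, assuming the default transmission parameters from section 4.8 of the RFC:

```python
# CoAP default transmission parameters (RFC 7252, section 4.8)
ACK_TIMEOUT = 2.0        # seconds; lower bound of the initial timeout
ACK_RANDOM_FACTOR = 1.5  # initial timeout is drawn from [2.0, 3.0]
MAX_RETRANSMIT = 4       # a confirmable request is retransmitted at most 4 times

def retransmit_timeouts(initial=ACK_TIMEOUT):
    """Timeout before each retransmission (or final give-up), doubling each time."""
    timeouts, t = [], initial
    for _ in range(MAX_RETRANSMIT + 1):  # original attempt + 4 retries
        timeouts.append(t)
        t *= 2
    return timeouts

# With the minimum initial timeout: [2.0, 4.0, 8.0, 16.0, 32.0], i.e. the last
# retransmission goes out 2+4+8+16 = 30 s after the first (up to 45 s with
# ACK_RANDOM_FACTOR applied, matching MAX_TRANSMIT_SPAN = 45 s).
print(retransmit_timeouts())
```

So a test that silently drops CON requests should see the client give up only after the fourth retry, roughly within that window.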

I just paged through the current tests. Is there any automation here, or are they run manually?

I could adapt/target my existing automated tests to the specs above. I would want to convert any non-RIOT endpoint code in the tests to use aiocoap; currently they use a simple CoAP library I created. Also, the automated tests require pexpect. Is this dependency acceptable?

I would be happy to run these tests for this release and document how they are run and verified as I go.


miri64 commented Oct 13, 2018

This sounds good. AFAIK there is some automation for the tests around, sadly never submitted to the repo. @jia200x, can you point to it? Experience tells me that having someone else run the tests can be beneficial for at least one release, so the results are non-biased (the first time someone other than me ran the GNRC tests, they were like "WTF?" ;-)).


jia200x commented Oct 15, 2018

> With RIOT-OS/RIOT#7428 I was wondering if we could integrate CoAP into our release specifications

Makes 100% sense.

> This sounds good. AFAIK there is some automation for the tests around, sadly never submitted to the repo. @jia200x can you point to that? Experience tells me, that someone else running the tests might be beneficial for at least one release, so its non-biased (first time someone else than me ran the GNRC tests they were like "WTF?" ;-)).

I have some scripts for running Release Specs tasks (via serial port or IoT-LAB). I will push them soon-ish :)


kb2ma commented Nov 2, 2018

@jia200x, I reviewed the scripts in your testrunner branch. They are awesome! Really nice use of Python. I need something like this to move my functional test automation forward and hopefully into the RIOT tree. I'll also automate the tests in the 09-coap directory, but it won't be ready for this release.


jia200x commented Nov 6, 2018

> @jia200x, I reviewed the scripts in your testrunner branch. They are awesome! Really nice use of Python. I need something like this to move my functional test automation forward and hopefully into the RIOT tree. I'll also automate the tests in the 09-coap directory, but it won't be ready for this release.

@kb2ma thank you so much! :) I based them on Django's testrunner. They still need some work, but I have always found this mechanism quite easy to extend.
Feel free to take whatever you need from the branch :)
