Concept on how to deliver test cases for contributed Camara APIs #61
I can think of:
- Conformance tests. These validate the request and response syntax of an API, including response codes and exceptions (see the Gherkin sketch below for an example).
- Documentation conformance tests. Could be done by parsing the .md files to ensure the markdown template is followed.
- Penetration tests. Probably out of scope for CAMARA, but essential for implementers.
- Performance tests. Out of scope.
- Privacy by design and security by design. To be discussed; could be a manual review?
- User acceptance testing. In this case the user is the developer. Ideally API developers (and not the API authors) will review.

Others?
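To make the first category concrete, a conformance check could be phrased in Gherkin roughly as follows. This is a hypothetical sketch: the `/sessions` endpoint, the `duration` field, and the expected status code are illustrative, not taken from any CAMARA spec.

```gherkin
Feature: API conformance - request and response syntax

  Scenario: Request missing a mandatory field is rejected
    Given the API consumer is authenticated
    When a POST request is sent to "/sessions" without the mandatory "duration" field
    Then the response status code is 400
    And the response body contains a machine-readable error code and a human-readable message
```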
Use the suggestions and feedback provided here. Contribute the first set of test cases for one of the subprojects, for one or more of the categories listed above, and then derive a (guidelines) doc from this contribution.
We propose to use Gherkin as the DSL for the ATP. This DSL allows (via other tools, at each operator's discretion) automating test campaign execution. An example of a baseline ATP using Gherkin is provided in PR #102. @Kevsy: in relation to the tests you have listed above, here's our feedback.
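For illustration, a baseline ATP scenario written in Gherkin could look like the sketch below. The tag, steps, and profile name are assumptions made up for this example; the actual baseline ATP is the one provided in PR #102.

```gherkin
@qod
Feature: QoD session lifecycle (baseline ATP)

  Background:
    Given the service endpoint and API credentials are configured

  Scenario: A created session can be retrieved by its id
    Given a QoD session is created with the illustrative profile "QOS_E"
    When the session is retrieved by its id
    Then the response status code is 200
    And the returned session matches the created one
```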
Thanks @jordonezlucena. For 'user acceptance testing': the 'user' here is the API consumer. So UAT is to make sure the API is useful and usable - otherwise we have no feedback and end up publishing the API and hoping it will be used. An example could be that the network operators define an API using certain operations, datatypes, error messages etc. that cause problems for the API consumer, in which case the UAT feedback would be used to reconsider the API definition. I see UAT as an iterative process (before and after the conformance test) with the result that we have a useful/conformant spec.
@jordonezlucena : The proposal to use Gherkin as the DSL is fine from DT's side.
Thanks both for the feedback. In the meantime, my take is to draft a (live) table where we specify: 1) the DSL for testing, and 2) tests in scope and out of scope in CAMARA. Does that make sense?
Hi @ALL, |
Looking forward to it.
The pull request is here: #148. You can take a look at it.
The Cucumber framework is based on Gherkin; we are sharing samples of QoD Cucumber tests.
@mdomale, For me the question is not: "what is the best framework to test my implementation by myself?" but "which framework makes it easy for someone else to test their implementation for conformance using the suite I provide?" I think that Cucumber has a few downsides that you didn't mention in your comparison:
A Karate collection can be run in VS Code without any knowledge of the programming language, and can also mock NEF. Maybe I'm trying to answer the wrong question, let me know if that's the case.
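For comparison, a Karate test is itself written in Gherkin syntax, but it embeds the HTTP calls and assertions directly in the feature file, so no Java glue code is required to run it. The sketch below is hypothetical: `baseUrl` is assumed to be defined in `karate-config.js`, and the request fields are illustrative rather than taken from the QoD spec.

```gherkin
Feature: QoD session conformance, runnable directly by Karate

  Background:
    # baseUrl is assumed to be provided by karate-config.js
    * url baseUrl

  Scenario: Create a session and check the response shape
    Given path 'sessions'
    # illustrative request body, not the authoritative QoD schema
    And request { duration: 3600, qosProfile: 'QOS_E' }
    When method post
    Then status 201
    # '#uuid' is Karate's built-in fuzzy matcher for UUID strings
    And match response.id == '#uuid'
```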
@patrice-conil - Thanks for your PR, and @mdomale - thanks for your references. Both are useful reads. There are a few comments from my side.
@sfnuser: As test artifacts are expected to be in the main subproject repos, there is at least a desire to align on the framework/tools to be used for them (and yes, across subprojects). For the main subproject repos, it would be useful to have a common alignment. Let us know your view.
@sfnuser, Are we targeting validation for API consumers, API providers, or both?
@shilpa-padgaonkar I believe provider implementations can add their own set of tests that suit their needs, within their repo. I don't think we need commonalities across PIs on this aspect.

@patrice-conil Good question. I think it is nice to have both. However, if we have to choose one at this point, API providers would be my pick. It would put the onus on all of us to consistently upgrade and maintain :-), and API consumer devs are free to choose PIs.
@patrice-conil
@mdomale,
Although we don't want to restrict the usage of different tools (to ensure maximum flexibility), we would prefer the stack below:
@patrice-conil @sfnuser Can you please confirm that you are fine with the above stack, or raise any concerns you have?
@mdomale, |
As @patrice-conil mentioned, if the instructions to dockerize the test suite are also provided, I have no further concerns. Cheers.
@sfnuser @patrice-conil Thanks a lot for your responses. Yes, we can provide a Docker/k8s image. @shilpa-padgaonkar @akoshunyadi
@patrice-conil @sfnuser We have created a draft pull request with our initial contribution for the QoD APIs, and the integration of WireMock will be done shortly.
@patrice-conil @sfnuser @jlurien The PR is already raised for review.
Providing API test cases is included as one of the mandatory items in the API Readiness minimum criteria checklist defined in Commonalities. These checklist steps are part of the requirements for generating a stable release v1.0.0 in CAMARA, as is being discussed in Commonalities #139.

As @jordonezlucena mentioned above, we propose to use Gherkin as the DSL for the ATP. This DSL allows (via other tools, at each operator's discretion) automating the execution of test campaigns. And perhaps specific test implementations using Cucumber (or other frameworks) could be part of the provider implementation repos, if applicable.

@rartych @shilpa-padgaonkar @jordonezlucena @jlurien @patrice-conil @sfnuser @mdomale @hdamker To satisfy step 5 ("API test cases and documentation") of the API Readiness checklist, could we conclude that the mandatory requirement is to provide the `.feature` Gherkin file describing the test scenarios, and leave implementation specifics to the implementation repositories?
@jpengar : thanks for picking this issue up again.
@jpengar and @jordonezlucena : This issue was already resolved with agreement from multiple participants (see comment #61 (comment) and #61 (comment)). Based on this agreement, a PR was created in the QoD subproject, which was reviewed, approved and merged. I see now that @jpengar has requested 2 new things:

1. I would rather open a new issue where we extend the already-made contribution with the Gherkin feature file. DT will then create a PR in the QoD subproject to add the feature file.
2. At least for QoD, all 3 current providers (DT, Orange and SpryFoxNetworks) are fine with having a common implementation as currently provided. If you are still keen to move this out, I would recommend you to start this as a separate discussion and try to get a new consensus there.

@jpengar @jordonezlucena : Please provide feedback if you are ok with the recommendations above.
@shilpa-padgaonkar In my personal defense (kidding), I will say that the issue was still open when I got to it, and it was previously referenced only a few days ago... :)

Anyway, what I wanted to know is what exactly the requirement is to fulfill step 5 ("API test cases and documentation") of the API Readiness checklist, and whether providing the .feature file is good enough. As mentioned above, the .feature file is actually the Gherkin file that describes the test scenarios. And this is not a new request: if you check the PR camaraproject/QualityOnDemand#134, the .feature file is provided as part of the pull request as the test case definition. But apart from that, the test plan implementation is also provided using the Cucumber framework (test runner, test step implementation in Java, etc.).

As for moving the test case implementation into provider implementation repos: for me it makes sense, but if it was agreed to include it in the main project, I would like to know whether it is a mandatory requirement to satisfy the corresponding step in the API Readiness checklist, or if a .feature file describing the test cases is good enough as a prerequisite to generate a first stable API version v1.0.0.
@jpengar : :) No worries. I would say that the feature file would fulfill the requirements from the minimum readiness checklist. But this is just my personal opinion, and we could check in the group for feedback from others.
Since a Gherkin .feature file is a high-level description of a specific set of features/behaviours that the API implementation provides as user experience, @FabrizioMoggio and I are fine with adding it to the related subproject (e.g. TrafficInfluence). In this way it will serve as a useful document of the API's "internal" behaviour, which stays hidden if you rely only on the OpenAPI specs, and it will be easier and faster to assess the minimum criteria without being an expert in BDD or Gherkin. Consequently, the test case implementation (Cucumber), which is tightly coupled with the reference implementation (preconditions, states, mocks, etc.), will be hosted in the implementation repo. If we understand BDD correctly: first write the .feature file, then approve it, then write the implementation code.
Thank you @jpengar @rafpas-tim @FabrizioMoggio for your feedback. I would like to propose the following:
Would this work? If it is still important for you that we move the QoD test implementation to the provider implementation repo, we can also do that. Either way would be fine for me.
It makes sense to me. Thank you @shilpa-padgaonkar
@jpengar : Thanks for your feedback. @rafpas-tim @FabrizioMoggio : Could you kindly provide your feedback? If you are ok with the proposal, we can go ahead and close this issue.
This is fine with me.
In what format are the test cases expected?