
contributing

Congjian Wang - INL edited this page Jun 2, 2021 · 51 revisions

Bug and Development Reporting

If you are a user and:

  • you have detected anomalous behavior of RAVEN (a bug), or
  • you need a new feature added to RAVEN,

feel free to:

  • contact us using the RAVEN user mailing list (inl-raven-users@googlegroups.com), or
  • create a new issue at https://github.com/idaholab/raven/issues. If the issue is not assigned to a specific developer, please mention @wangcj05, @PaulTalbot-INL, or @mandd in the issue description. If possible, include the input files, the output files, and the versions of the libraries, so we can reproduce and fix the issue more quickly.

External Contribution to RAVEN code

General Contribution Workflow

Contributions from external users/developers are welcome. In order to contribute to the RAVEN code, follow the steps below:

  • Open an issue in the RAVEN repository explaining the proposed development and/or feature
  • Create a branch in your own fork of RAVEN
  • Perform the development (see the developer guide)
  • Open a Pull Request and wait for a RAVEN developer to review it

It is important to note that an external development will be accepted only if:

  • the RAVEN team agrees that the development belongs in the RAVEN mainstream (wait for a developer to review your issue before beginning the development), and
  • the development follows the RAVEN standards (e.g. coding standards, check lists, documentation). More information about the development standards can be found in the developer guide and the check lists wiki pages.

New Code Interface

A typical contribution from external developers is the coupling of RAVEN with an external code. The procedure to couple a new code is detailed in:

  • the user manual, located in ./doc/user_manual/raven_user_manual.pdf, and
  • the workshop presentation that can be found in ./doc/workshop/codeCoupling/code_coupling.pptx.

Note (1): Only Code Interfaces that inherit from the CodeInterfaceBase class (i.e. class NewCodeInterface(CodeInterfaceBase)) will be accepted, since CodeInterfaceBase allows RAVEN to test the mechanics of the interface without needing an executable (see below).
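As an illustrative sketch (not RAVEN's actual source), the overall shape of such a subclass might look like the following. The stub base class here is only a stand-in so the example is self-contained; the real CodeInterfaceBase ships with the RAVEN framework, and the method names and signatures below follow the developer guide but should be checked against your RAVEN version.

```python
class CodeInterfaceBase:
    """Stand-in stub for RAVEN's real CodeInterfaceBase (for illustration only)."""
    pass

class NewCodeInterface(CodeInterfaceBase):
    """Hypothetical minimal code interface; names and return shapes are illustrative."""

    def generateCommand(self, inputFiles, executable, clargs=None, fargs=None):
        # Build the shell command RAVEN will execute for one sample, plus the
        # root name used for the run's output files.
        commands = [('parallel', f'{executable} -i {inputFiles[0]}')]
        outputRoot = 'out~' + 'myRun'
        return commands, outputRoot

    def createNewInput(self, currentInputFiles, oriInputFiles, samplerType, **kwargs):
        # Write the sampled variable values into the driven code's input files
        # (here we simply pass the files through unchanged).
        return currentInputFiles

    def finalizeCodeOutput(self, command, output, workingDir):
        # Convert the driven code's native output into a RAVEN-compatible CSV
        # and return the (extension-less) name of the produced file.
        return output + '.csv'
```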

Note (2): The Pull Request for a new Code Interface must reference issue #611 for tracking purposes.

Once the development of a new Code Interface is finalized, the following requirements need to be satisfied before the contribution can be merged into the RAVEN main repository:

  • User Manual Update
  • Addition of test cases
  • Read, Sign and Return (to D. Mandelli and A. Baker) the INL Open Source Contributor License Agreement (downloadable here)

In the following subsections, the first two requirements are discussed in more detail.

User Manual Update

The RAVEN user manual needs to be updated, adding a new chapter under the section Existing Interfaces explaining the usage of the new code interface. The new section must contain (at least) the following subsections:

  1. General Information: general description of the newly developed interface (e.g. usage, possible limitations, etc.)
  2. Models: how to invoke the new code interface, with a clear description of all the additional XML nodes that the new code interface requires to run correctly (e.g. special keywords, initialization options, etc.)
  3. Files: which input files might be required and how to list them; for example, the allowed input files, the input file extensions, and any special types (e.g. input.inp)
  4. Samplers/Optimizers: the syntax of the variable naming conventions used to "inform" the code interface on how to perturb the input files
  5. Output Files conversion: how the output files of the driven code are converted into a RAVEN-compatible CSV file (e.g. variable naming conventions, the types of outputs that are exported, etc.)
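To make the output-conversion step concrete, here is a minimal, hypothetical sketch of the kind of work a conversion routine performs: parse the driven code's native output and emit a RAVEN-compatible CSV (a header row of variable names followed by data rows). The output format and names below are invented for illustration; a real interface parses its own code's format.

```python
import csv
import io

# Stand-in for the driven code's native (whitespace-separated) output file.
raw_output = """time result
0.0 1.5
1.0 2.5
"""

# Split each line into fields: header first, then one row per result.
rows = [line.split() for line in raw_output.strip().splitlines()]

# Write the rows out as CSV, which is the format RAVEN ingests.
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)
print(buffer.getvalue().strip())
```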

Addition of test cases

A set of tests (at least one) for the new code interface needs to be created. The tests need to cover all the mechanics of the newly developed Code Interface, such as:

  1. input parsing,
  2. output parsing and conversion into RAVEN-compatible CSV file and
  3. all the options that the Code Interface allows (e.g. different calculation types of the driven code, different optional outputs, etc.).

The procedure to add a new test is as follows:

  • Navigate to directory ./raven/tests/framework/CodeInterfaceTests/
  • Create a new folder named after the newly developed Code Interface (e.g. RELAP5 if the class of the code interface is declared as class RELAP5(CodeInterfaceBase))
  • Create a text file named tests in the just-created directory. The test specifications go in this file. An example and an explanation of the most common parameters are reported below:
[Tests]
 [./test_1]
  type = 'RavenFramework'
  input = 'raven_input_file.xml'
  output = 'path/to/output/out_1.out path/to/output/out_2.out'
  csv = 'path/to/ordered/csv/csv_1.csv path/to/ordered/csv/csv_2.csv'
  UnorderedCsv = 'path/to/unordered/csv/u_csv_1.csv path/to/unordered/csv/u_csv_2.csv'
  xml = 'path/to/ordered/xml/xml_1.xml path/to/ordered/csv/xml_2.xml'
  UnorderedXml = 'path/to/unordered/xml/u_xml_1.xml path/to/unordered/xml/u_xml_2.xml'
  text = 'path/to/a/text/file/a_text.inp path/to/a/text/file/a_text.out'
  max_time = 500
  rel_err = 1e-5
  test_interface_only = True 
 [../]
 [./test_2]
  ...
 [../]
 ...
[]
| Tag name | Required | Default | Description |
|----------|----------|---------|-------------|
| type | TRUE | None | String, type of test: in the case of RAVEN, this tag is equal to "RavenFramework" |
| input | TRUE | None | String, test input file: the RAVEN input file |
| test_interface_only | FALSE | FALSE | Bool, interface check: informs RAVEN that an external code executable is not present and that the outputs from a specific Step are already present in the working directory. Used for all Code Interface tests. |
| output* | FALSE | None | Space-separated list of strings: output files that the tester is going to look for (and error out if not found) |
| csv* | FALSE | None | Space-separated list of strings: output files (CSV) that the tester compares with a reference solution in the gold directory. The values need to match row by row and column by column. |
| UnorderedCsv* | FALSE | None | Space-separated list of strings: output files (CSV) that the tester compares with a reference solution in the gold directory. The values will be reordered, so a different ordering of the data is acceptable. |
| xml* | FALSE | None | Space-separated list of strings: output files (XML) that the tester compares with a reference solution in the gold directory. The values in each XML node need to match entry by entry. |
| UnorderedXml* | FALSE | None | Space-separated list of strings: output files (XML) that the tester compares with a reference solution in the gold directory. The values in each XML node will be reordered before comparing. |
| text* | FALSE | None | Space-separated list of strings: text files that the tester compares with a reference solution in the gold directory. The files are compared as a plain DIFF. |
| rel_err | FALSE | 1.00E-08 | Float, relative error: tolerated relative error between the reference solution in the gold directory and the generated output files |
| max_time | FALSE | 500 | Float, max time: maximum time for the test to run. If the test takes longer, it is automatically tagged as failed. |

* At least one of these tags must be present

     In order to create a test for a Code Interface without needing an executable, the tag test_interface_only must be set to True. In addition, any file that needs to be compared by any of the DIFFers reported above needs to be placed in ./raven/tests/framework/CodeInterfaceTests/NewInterfaceFolder/gold/path/to/output/file.

  • Add the output files of the driven code in the right directory. The only difference between a test with and without the executable of the driven code is the presence of the outputs in the working directory.
    • For example, assume that:
      • the test test_my_interface.xml is located in the directory ./raven/tests/framework/CodeInterfaceTests/MyInterfaceTest,
      • the <workingDir> (in test_my_interface.xml) is set to MyFirstTest,
      • the <Step> is named sampleMyCode and it generates 2 realizations (samples) of the driven model, whose outputs are named my_output.out
      • test_my_interface.xml has a CSV OutStreams (e.g. <OutStreams> <Print name="out_streams_RAVEN"> (...) </Print> </OutStreams> )
    • Consequently, the output my_output.out for the first realization must be added in ./raven/tests/framework/CodeInterfaceTests/MyInterfaceTest/MyFirstTest/sampleMyCode/1/ and, for the second, in ./raven/tests/framework/CodeInterfaceTests/MyInterfaceTest/MyFirstTest/sampleMyCode/2/.
    • And finally, the tests file in the directory ./raven/tests/framework/CodeInterfaceTests/MyInterfaceTest would look like the following:
[Tests]
 [./test_1]
  type = 'RavenFramework'
  input = 'test_my_interface.xml'
  output = 'MyFirstTest/sampleMyCode/1/my_output.csv MyFirstTest/sampleMyCode/2/my_output.csv'
  UnorderedCsv = 'MyFirstTest/out_streams_RAVEN.csv'
  text = 'MyFirstTest/sampleMyCode/1/my_input.inp'
  max_time = 500
  rel_err = 1e-5
  test_interface_only = True 
 [../]
[]
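The directory layout described above can be sketched with a few shell commands (the paths mirror the MyInterfaceTest example; the file names and placeholder contents are illustrative — in a real contribution the outputs come from actually running the driven code once):

```shell
# Create the working-directory tree that an interface-only test expects.
BASE=raven/tests/framework/CodeInterfaceTests/MyInterfaceTest
mkdir -p "$BASE/MyFirstTest/sampleMyCode/1" "$BASE/MyFirstTest/sampleMyCode/2"
# The gold directory holds the reference solutions the DIFFers compare against.
mkdir -p "$BASE/gold/MyFirstTest"
# Drop the pre-generated outputs in place, one per realization.
echo "placeholder output" > "$BASE/MyFirstTest/sampleMyCode/1/my_output.out"
echo "placeholder output" > "$BASE/MyFirstTest/sampleMyCode/2/my_output.out"
ls "$BASE/MyFirstTest/sampleMyCode"
```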

  An example for one of the codes RAVEN is currently coupled to (SCALE) can be found in the regression test directory.

  • Modify all the test files (e.g. test_my_interface.xml) and add the <TestInfo> XML node within the <Simulation> block:
<Simulation>
  ...
  <TestInfo>
    <name>framework/path/to/test/label</name>
    <author>AuthorGitHubTag</author>
    <created>YYYY-MM-DD</created>
    <classesTested>Module.Class, Module.Class</classesTested>
    <description>
        Paragraph describing workflows, modules, classes, entities, et cetera, how they are tested, and any other notes
    </description>
    <requirements>RequirementsLabel</requirements>
    <analytic>paragraph description of analytic test</analytic>
    ...
  </TestInfo>
  ...
</Simulation>

The <requirements> and <analytic> nodes are optional; they are for tests that satisfy an NQA design requirement and/or have an analytic solution documented in the analytic tests document. Other notes on the block contents:

  • name: the test framework path, together with the name/label assigned in the tests file. This is the path and name that show up when running the tests with the testing harness (run_tests). For example, if the test is named test_1 in the tests file and located in ./raven/tests/framework/CodeInterfaceTests/MyInterfaceTest/, the <name> must be set to framework/CodeInterfaceTests/MyInterfaceTest.test_1
  • author: the GitHub tag of the author who originally created the test, e.g. alfoa for @alfoa.
  • created: the date on which the test was originally created, in year-month-day YYYY-MM-DD XSD date format.
  • classesTested: a list of the classes tested in the Python framework, listed as Entity.Class, e.g. Models.Code.MyInterfaceClass.
  • description: general notes about what workflows or other methods are tested.
  • requirements: (optional) lists the NQA requirement that this test satisfies.
  • analytic: (optional) describes the analytic nature of this test and how it is documented in the analytic tests documentation.

An additional node is optionally available to demonstrate significant revisions to a test:

<Simulation>
  ...
  <TestInfo>
    ...
    <revisions>
      <revision author="AuthorGitHubTag" date="YYYY-MM-DD">paragraph description of revision</revision>
    </revisions>
    ...
  </TestInfo>
  ...
</Simulation>
  • Now you are ready to check that the just-created test(s) work in the regression test suite. Navigate to the raven root folder (e.g. /home/username/projects/raven/), then:
    • execute the command ./run_tests -j8 and check that your tests run among all the other tests, or
    • execute the command ./run_tests --re=framework/CodeInterfaceTests/MyInterfaceTest and check that your test(s) work as expected.