From faf4cd5945c86a5c7ffcb9429431afb0fb64b17a Mon Sep 17 00:00:00 2001 From: Diego Mandelli Date: Thu, 18 Oct 2018 12:27:23 -0600 Subject: [PATCH] PRA plugin machine milestone adapted new data object (#626) * added test files * edits * test for expanded ET * Non-multilevel Optimizers reworked (#507) * optimizers passing with the new DataObjects except multilevel * Alfoa/data object rework (#509) * fixed CustomSampler * fixed test_output for new version of matplotlib * added files * edits * Alfoa/test fix (#525) * added _reassignSampledVarsPbToFullyCorrVars to Sampler base class. * Script now offers a flag to change ignored files (#508) -e can be followed by a comma separated list of directories or files which will be ignored. Old behavior is to only ignore .git. New default is .git and raven. * fixed run_tests * Framework in makefile (#520) * Added hit to all requirements and clean * Now bindings instead of binary * Moves hit.so to usable location * fixed another parsing error in tests file for framework/PostProcessors/TemporalDataMiningPostProcessor/Clustering.GaussianMixture * added new ID of moose with fix on TestHarness * fixed make file for windows * rework: test harness fix (#526) * fixed test harness to remove output files * Asynchronous History Sampling (#511) * fixed adding histories of different lengths through addRealization * fixed restart MC typo * Fix data mining for the new data object (#512) * edit * edits * initial fix for data mining pp * fix plot and pca * fix plot and HS2PS * fix temporal data mining pp * fix time-dep data mining based on dimensionality reduction * fix dataobjectfilter pp and datamining pp * regold tests, because label switches, and change of prints * addition regold * regold pca * fix comments * fix more comments * fix time-dep basic statistics pp and history set sync pp (#515) * fix time-dep basic statistics pp and history set sync pp * add regold files * fixed internal parallel test for PP (#521) * update user manual for several post-processors (#523) * update documents for basic statistics pp * update documents for metric and cross validation pp * update documents for ImportanceRank pp * update documents for external post-processor * update documents for data mining post processor * resolve comments * rework: raven running raven (#522) * Script now offers a flag to change ignored files (#508) -e can be followed by a comma separated list of directories or files which will be ignored. Old behavior is to only ignore .git. New default is .git and raven. 
* test added * improving analytic test doc * Generalized typechecking (#524) * isAType and unit tests * converted dataobjects to use mathutils typechecking * edits * regold historysetsyn tests (#529) * started working on DET * Fix clustering with DTW and ETImporter PP (#531) * fix solution export in Temporal data mining * fix ETImporter and regold AffinityPropagation * fix dtw * External Model collection fix (#534) * fixed external model and added a complexity test * fixed external model to catch more variables * Alfoa/incorrect test files (#535) * fixed incorrect tests * fixed custom mode * removed useless stuff * added regold file * fixed AdaptiveBatch test files * fixed test_Custom_Sampler_DataObject.xml * fixes (#533) * Fix interface pp for risk measure * Fix xsd and PostProcess in raven tutorial (#536) * fix postprocess test in the user_guide * fix xsd * comment out the postprocess output * fixed gold file for getpot test (#539) * rework: Multilevel optimizers (#537) * fixed external model and added a complexity test * fixed external model to catch more variables * fixed pathing * fixed infinite missing file * optimizer tests working * rework: CSV fixes and improvements (#538) * loading from dict or CSV now extends existing data instead of replacing it * fixed multiple history csv loading * added failure check test * added files * edits * initial implementation of markov categorical distribution * rework: added libs to conda setup (#542) * added xarray, netcdf4 * trying to fix moosebuild failure by adding separator keyword in csv reader in tester * edits * edits * Deep Learning from Scikit-Learn (#547) * add Multi-layer perceptron classifier and regressor into raven * add tests for multi-layer perceptron neural network * add manuals * modify input files to avoid regold * fix typos and update scikit-learn versions * rework: Raven-runs-raven with ROM (#548) * fixed time-dependent SKL ROM, added test * slight regold in 5th sigfig for raven runs raven case * rework: MOOSE submodule update (#550) * update to latest moose master, includes test harness failure improvements * compute the steady state probability * reverted QA version of sklearn to 0.18 (#552) * edit * add gold files, fix parser error * update tests * add python module for external model * edits * edits * edits * fixed rrr example plugin test (#558) * fix library version for skl (#559) * edits * edits * edits * initial implementation of DataClassifier postprocessor * update functions * update test * edits * update tests, fix typos, add gold files * added paper * add the capability to handle input with type of HistorySet * edits * add documents for DataClassifier pp * edit * edits * Hierarchical and DET for new DataObject (#532) * fixed a portion of ensemble model * added check for consistency for shape in indexes * added unit test for checking index/var shapes * fixed custom sampler * updated revision * addressed Hier issue * fixed Hierarchical and DET tests (no adaptive yet) * skipped a portion of the unit test since it is not finished yet * fixed new modification in XSD * addressed Paul and Congjian's comments * fixed MAAP5interfaceAHDETSampling and MAAP5interfaceADETSampling * fixed framework/user_guide/ravenTutorial.singleRunPlot, framework/user_guide/ravenTutorial.singleRunSubPlot, framework/user_guide/ravenTutorial.RomLoad * fixed user_guide heavy tests * fixed start * added ext report codes * addressed Congjian's comments (1of2) * added documentation for hierarchical flag * rework update cashflow (#560) * updated CashFlow 
submodule id * Code Clean Up (#567) * clean up reassignSampledVarsPbToFullyCorrVars, because we add it to the Sampler base class * move reassignPbWeightToCorrelatedVars to Sampler base class * fix comments for PR#560 * fix pb in Monte Carlo and fix generic code interface test * fix cluster tests (#566) * fix cluster tests * fix parallelPP test * Alfoa/dataobject rework finalize ensemble (#565) * Closes #541 * Update GenericCodeInterface.py * fixed tester (#528) * ensemble model pb weights for variables coming from functions * fixed single-value-duplication error for SKL ROMs (#555) * fixed single-value-duplication error * fixed test framework/ensembleModelTests.testEnsembleModelWith2CodesAndAliasAndOptionalOutputs * modified order of input output to avoid regolding * Reducing DataObject Attribute Functionality (#278) * Enabling the data attribute tests and fixing the operators for PointSets. TODO: Break the data_attributes test down to be more granular and fix the outputPivotValue on the HistorySets. * Splitting the test files for the DataObject attributes and correcting some malformations in the subsequent input files. TODO: Fix the attributes for the history set when operating from a Model. * Fixing HistorySet data attribute test case to look for the correct file. * Correcting attributions for data object tests. maljdan had only moved the files. The original tests were designed by others. TODO: verify if test results are valid or the result of incorrect gold files. * Reducing the number of DataObjects needed in the shared suite of DataObject attribute tests. * Regolding the DataObject HistorySet attributes files to respect the outputPivotVal specified for stories2. * Picking up where I left off, trying to recall what modifications still need to be done to the HistorySet. * Regolding a test case on data attributes, removing dead code from the HistorySet and updating some aspects of the PointSet. * Removing data attribute feature set with explanation in comments. Cleaning old code. * Regolding fixed test case. * Reverting changes to ensemble test and accommodating unstructured inputs. 
* addressed misunderstanding in HistorySet * added HSToPSOperator PP * added documentation for new interface * finished new PP * addressed first comments * addressed Congjian's comments * updated XSD * moving ahead * fixed test framework/ensembleModelTests.testEnsembleModelLinearThreadWithTimeSeries * fixed framework/ensembleModelTests.testEnsembleModelLinearParallelWithOptimizer * fixed framework/CodeInterfaceTests.DymolaTestTimeDepNoExecutableEnsembleModel * fixed framework/PostProcessors/InterfacedPostProcessor.metadataUsageInInterfacePP * fixed new test files coming from devel * updated InterfacedPP HStoPSOperator * fixed xsd * added documenation for DataSet * added conversion script from old HDF5 to new HDF5 * Update DataObjects.xsd * remove white space * Update database_data.tex * Update postprocessor.tex * removed unuseful __init__ in Melcor interface * addressed Congjian's comments * ok * moving * moving ahead * ok * moving * aaaa * ok * a * CSV printing speedup (#570) * Closes #541 * Update GenericCodeInterface.py * fixed * fixed tester (#528) * ensemble model pb weights for variables coming from functions * fixed single-value-duplication error for SKL ROMs (#555) * fixed single-value-duplication error * xsd * fixed type * fixed test framework/ensembleModelTests.testEnsembleModelWith2CodesAndAliasAndOptionalOutputs * modified order of input output to avoid regolding * ok * Reducing DataObject Attribute Functionality (#278) * Enabling the data attribute tests and fixing the operators for PointSets. TODO: Break the data_attributes test down to be more granular and fix the outputPivotValue on the HistorySets. * Splitting the test files for the DataObject attributes and correcting some malformations in the subsequent input files. TODO: Fix the attributes for the history set when operating from a Model. * Fixing HistorySet data attribute test case to look for the correct file. * Correcting attributions for data object tests. maljdan had only moved the files. The original tests were designed by others. TODO: verify if test results are valid or the result of incorrect gold files. * Reducing the number of DataObjects needed in the shared suite of DataObject attribute tests. * Regolding the DataObject HistorySet attributes files to respect the outputPivotVal specified for stories2. * Picking up where I left off, trying to recall what modifications still need to be done to the HistorySet. * Regolding a test case on data attributes, removing dead code from the HistorySet and updating some aspects of the PointSet. * Removing data attribute feature set with explanation in comments. Cleaning old code. * Regolding fixed test case. * Reverting changes to ensemble test and accommodating unstructured inputs. 
* addressed misunderstanding in HistorySet * added HSToPSOperator PP * added documentation for new interface * finished new PP * addressed first comments * addressed Congjian's comments * updated XSD * moving ahead * fixed test framework/ensembleModelTests.testEnsembleModelLinearThreadWithTimeSeries * last one almost done * fixed framework/ensembleModelTests.testEnsembleModelLinearParallelWithOptimizer * fixed framework/CodeInterfaceTests.DymolaTestTimeDepNoExecutableEnsembleModel * almost done * fixed framework/PostProcessors/InterfacedPostProcessor.metadataUsageInInterfacePP * fixed new test files coming from devel * updated InterfacedPP HStoPSOperator * fixed xsd * added documenation for DataSet * added conversion script from old HDF5 to new HDF5 * Update DataObjects.xsd * remove white space * Update database_data.tex * testing printing * reverted to_csv for ND dataset. Need a good test for multiple-index dataset printing. * added benchmark results for numpy case * Rework Ensemble for Indexes (#571) * got the test case working WITH picard iteration, now working to sort it out so picard is not used * works without picard * cleanup * fix for single residual values * order change for xsd sake * added user guide entry, added some slight additional testing * gold file * adding stuff * xsd fix * stash for syncing * added tips and tricks in docs * cleanup * some comments addressed * changed all raven entities to use UpperCaseCapitalization in sentences * ok * try * finished DMD * edit ensemble test * Alfoa/performance improvement ensemble model (#581) * removed piclking of TargetEvaluation * removed pickling of Optional Outputs and removed specialization in the Step for ensembleModel * changed name of local target evaluation * changed name of local target evaluation * addressed Congjian's comments and all execpt one of Paul's ones * fixed remove in assembler * graph time dep * graph time dep * ET TD * add missing files * resolve comments * edits * Talbpaul/rework maxqsize (#584) * Closes #541 * Update GenericCodeInterface.py * fixed * fixed tester (#528) * ensemble model pb weights for variables coming from functions * stash * fixed failing tests by adding maxqueuesize back to 1 * test revisions added * revision author name * edits * added temporary walltime for codes * edits * fixed conflicts (#595) * fixed conflicts * fixed typo * fixed one of the categorical cases * fixed restart * library fixes * fixed netcdf4 specification * fixing numpy version again * trying numpy 1.14 * numpy 1.11 * test for missing variables in restart added * numpy 1.14.0 * 1.14 with inclusion in conda list * Update existing_interfaces.tex * restarting with more conda version checking * Skipping ARMA reseed test * [rework] ExternalXML in RAVEN Code Interface (#596) * Closes #541 * Update GenericCodeInterface.py * fixed * fixed tester (#528) * ensemble model pb weights for variables coming from functions * cherry picking, test is not passing * fixed merge for rework * Optimizer inherits from Sampler (#600) * Job Profiling (#586) * implements job profiling * review comments * locking down xarray library versions * library change * pandas version lock * shuffled libraries according to discussion, pinned netcdf4 * trying pip package specs * changed the printing strategy of profiles (#601) * changed the printing strategy of profiles * Update Runner.py * added constant reading into solution export, also added test to verbosity test * removed debug prints * dummy change to run tests * modified spline...almost done * moving 
forward for Crow * ok * almost done * ok * Add "long" data type compatibility (#590) * Closes #541 * Update GenericCodeInterface.py * fixed * fixed tester (#528) * ensemble model pb weights for variables coming from functions * added long to integer options, added unit test coverage * version control * testing library versions in RavenUtils * found consistent library set * revert utils changes * patched up file closing for Windows * moved 2 tests to unordered csv * remove directory printing for history sets * removed path from history set CSVs * added verbosity for crosschecking * temporarily skipping time warping cluster tests due to Windows failures * returned outstream main printing * reducing strictness of user guide forward grid sampling test * fixed rel err in unordered csv differ * working out bugs for UnorderedCSVDiffer * tests passing, had to introduce zero threshold for two basic stats tests * increased debugging verbosity for debugging linux failures * faster version of thresholding, think it works for all types * now with less debug * fixed nested XML reading (#603) * finished fit for Spline * aaa * Fixes optimizer-runs-raven bug (#610) * commented out initial setting of point probability to prevent unintended downstream interactions * added verbosity to potential type failing, and regolded new prefixes (other values did not change) * added a test * added second test for spline interpolator * ok now working on DMD * Rw data final naming (#614) * relocated utils, dataobject unit tests and renamed dataobjects * relocated Files and Distributions unit tests as well * copied necessary files back to main test dir * ok * Alfoa/scale6.2 (#608) * added the parser * moving * ok * finished interface * added test + initial documentation * added documentation for SCALE coupling * missing regression tests * addressed Diego's comments * added test for combined TRITON and ORIGEN * added test for combined triton origen + added possibility in CustomSampler to use the functions * addressed Diego's comments again * revert old commit and address Diego's final comments * typo in tests file * added prereq in testExternalReseed to avoid conflict in parallel test execution * updated XSD schema * reset moose * cleaning up * Improved UnorderedCSVDiffer speed (#615) * cleaned up * cleanup * checked out dataobject-rework tests file * adding printing * ok * edits * moving * PRA plugin manual first edits * added math utils * edits * moving ahead * edits * add manual for the DataClassifier in PRAPlugin * almost done * edits * edit * moving * add manual for markov categorical distribution * edit * ok * edits * ok * added description * removed CSVs and added documentation * reverted modification in basic stats * edits * edits * edits * edits * removed files * removed files * added test for PolyExponential * missing test for DMD * added tests for DMD * addressed Diego's comments * added in TestInfo * regolded PolyExponential tests since I shortened the time series * added format for printing * tolerances * fixed coeff printing on scree for polyExp Poly * added minimum scipy version * remove xml checker for DMD since the eigenvalues are not necessarily ordered and consequently a spurious diff can happen * typo * added import of differential_evolution only where it is required * modified tests * update test for markov distribution * added comments * expand install script for conda 4.4 and beyond (#618) * expand install script for conda 4.4 and beyond * added explanatory comments * edits * change the data classifier 
to use the new structure of DataObjects * added missing files * Multi-sample Variables (vector inputs) (#625) * Optimizer inherits from Sampler * first implementation: by default copy value to all entries in vector variable, works * finished test and implementation of simple repeat-value vector variable sampling * added InputSpecs for optimizer, tests pass * got input params working for optimizer * first implementation: by default copy value to all entries in vector variable, works * finished test and implementation of simple repeat-value vector variable sampling * added InputSpecs for optimizer, tests pass * got input params working for optimizer * stash * fixed gradient calculation to include vectors, all non-vector tests passing * fixed gradient calculation to include vectors, all non-vector tests passing, conditional sizing for vector grad * boundary condition checking, all passing * redundant trajectories, all passing * same coordinate check * dot product step sizing * stochastic engine is incorrectly sized; currently each entry in vector is being perturbed identically. Needs work. * working on constraints, convergence is really poor, needs more help * first boundary conditions (internal) working, although type change in precond test * constraints fully done, only precond has a problem still, vector still not converging well * debugging difference between all scalars and vector * vector * time parabola model * fixed initial step size * working, although as a vector is a bit slower than all scalars * vector is faster than scalar, reduced scale of tests (and better solution) * all passing, but precond, which is having the type error still * cleaned up, removed scalar comparison test, fixed precond test * cleanup * last bit of cleanup, all tests passing * stash, it appears customsampler and datasets are not yet compatible * xsd * stash, cannot handle specific requests * reloading from dataset csv works be default * fixed unit test, vector test * xsd * CustomSampler handles Point,History,Data sets * cleanup * cleanup * updated custom sampler description docs * Optimizer uses Custom sampler with vector variables for initial points * unnecessarily-tested file * initial round of review comments * script for disclaimer adding, also added to models in optimizing test dir * increased verbosity for test debug * more verbosity for debugging * gold standard agrees with all test machines, personal cluster profile (my desktop find minimum in traj 1 of 36 instead of 220ish) * new golds * exposed RNG to RAVEN python...swig (#630) * exposed RNG to RAVEN python...swig * fixed for now dist stoch enviroment * added more missing files * missing files more * added last file * edits * edits * updated xsd * fix xsd for Markov categorical distribution * remove duplicated lines * remove duplicated lines * edits * edits * edits * Vector constants (#632) * shape from node to attribute * constants can now be vectors too * necessary Sampler and Optimizer changes * extracted common constant reading for sampler, optimizer * including string custom vector vars * vector constant works in rrr with optimizer * fix data classifier for HistorySet * fix typo * delete trailing whitespace * edits * pre-merge review comments addressed: framework/DataObjects (#646) * pre-merge review comments addressed for modules in framework/DataObjects, with the exception of merging DataObject into DataSet * removed hierarchal unecessary use of [:] * remainder of comments addressed * modified test for dataobject rework * fixes * edits after 
first round of review * removed useless files * cleaned files * removed keyword * modified docs * edits * edits * edits * edits * rm dataFile for MarkovCategorical dist, fix code to handle MAAP5 interface without executable * fix seeding for markov model * fix test for data classifier * resolve part of the comments * update raven user manual * capitalize the class name * update docs * update docs build * rename files * update FT tests * update ET tests * update class name * fix data object * merge documents for ETImportor PP * merge documents of DataClassifier and FTImporter PP * fix markov model with internal RNG class, and regold tests due to merge devel with issue #672 * first round of reseolved comments * delete whitespaces --- developer_tools/XSDSchemas/Distributions.xsd | 24 +- doc/user_manual/ProbabilityDistributions.tex | 66 +++ doc/user_manual/model.tex | 8 +- doc/user_manual/postprocessor.tex | 421 +++++++++++++++- framework/DataObjects/DataSet.py | 7 + framework/Distributions.py | 132 +++++ framework/Models/Code.py | 39 +- framework/PostProcessors/DataClassifier.py | 305 +++++++++++ framework/PostProcessors/ETImporter.py | 442 ++-------------- framework/PostProcessors/ETStructure.py | 473 ++++++++++++++++++ framework/PostProcessors/FTGate.py | 279 +++++++++++ framework/PostProcessors/FTImporter.py | 131 +++++ framework/PostProcessors/FTStructure.py | 148 ++++++ framework/PostProcessors/SampleSelector.py | 2 +- framework/PostProcessors/__init__.py | 9 +- framework/utils/InputData.py | 2 +- framework/utils/xmlUtils.py | 15 + plugins/PRAplugin/doc/Introduction.tex | 14 + plugins/PRAplugin/doc/Makefile | 21 + plugins/PRAplugin/doc/figures/ET.pdf | Bin 0 -> 7587 bytes plugins/PRAplugin/doc/figures/FT.pdf | Bin 0 -> 7801 bytes plugins/PRAplugin/doc/figures/RBD.pdf | Bin 0 -> 7798 bytes plugins/PRAplugin/doc/figures/markov.pdf | Bin 0 -> 12228 bytes .../PRAplugin/doc/include/DataClassifier.tex | 12 + .../PRAplugin/doc/include/ETdataImporter.tex | 19 + plugins/PRAplugin/doc/include/ETmodel.tex | 95 ++++ .../PRAplugin/doc/include/FTdataImporter.tex | 28 ++ plugins/PRAplugin/doc/include/FTmodel.tex | 108 ++++ plugins/PRAplugin/doc/include/MarkovModel.tex | 68 +++ plugins/PRAplugin/doc/include/RBDmodel.tex | 103 ++++ plugins/PRAplugin/doc/make_pra_plugin_docs.sh | 39 ++ plugins/PRAplugin/doc/user_manual.bib | 14 + plugins/PRAplugin/doc/user_manual.tex | 263 ++++++++++ plugins/PRAplugin/src/ETModel.py | 105 ++++ plugins/PRAplugin/src/FTModel.py | 188 +++++++ plugins/PRAplugin/src/GraphModel.py | 246 +++++++++ plugins/PRAplugin/src/MarkovModel.py | 201 ++++++++ plugins/PRAplugin/tests/ETmodel/eventTree.xml | 33 ++ .../PRAplugin/tests/ETmodelTD/eventTree.xml | 33 ++ plugins/PRAplugin/tests/FTmodel/FT1.xml | 58 +++ plugins/PRAplugin/tests/FTmodelTD/FT1.xml | 22 + .../PRAplugin/tests/dataClassifier/THmodel.py | 51 ++ .../tests/dataClassifier/eventTree.xml | 33 ++ .../tests/dataClassifier/func_ACC.py | 21 + .../tests/dataClassifier/func_LPI.py | 21 + .../tests/dataClassifier/func_LPR.py | 21 + .../tests/dataClassifierHS/THmodelTD.py | 64 +++ .../tests/dataClassifierHS/eventTree.xml | 33 ++ .../tests/dataClassifierHS/func_ACC.py | 23 + .../tests/dataClassifierHS/func_LPI.py | 23 + .../tests/dataClassifierHS/func_LPR.py | 23 + .../PRAplugin/tests/ensembleDiscrete/FT1.xml | 17 + .../PRAplugin/tests/ensembleDiscrete/FT2.xml | 17 + .../PRAplugin/tests/ensembleDiscrete/RBD.xml | 16 + .../tests/ensembleDiscrete/eventTree.xml | 42 ++ plugins/PRAplugin/tests/ensembleMixed/FT1.xml | 17 + 
plugins/PRAplugin/tests/ensembleMixed/FT2.xml | 17 + plugins/PRAplugin/tests/ensembleMixed/RBD.xml | 16 + .../tests/ensembleMixed/eventTree.xml | 42 ++ .../tests/gold/ETmodel/Print_sim_PS.csv | 11 + .../tests/gold/ETmodelTD/Print_sim_PS.csv | 11 + .../tests/gold/FTmodel/Print_sim_PS.csv | 11 + .../tests/gold/FTmodelTD/Print_sim_PS.csv | 51 ++ .../tests/gold/dataClassifier/Print_ET_PS.csv | 9 + .../gold/dataClassifier/Print_sim_PS.csv | 11 + .../gold/dataClassifierHS/Print_sim_PS_0.csv | 4 + .../gold/dataClassifierHS/Print_sim_PS_3.csv | 12 + .../gold/dataClassifierHS/Print_sim_PS_7.csv | 7 + .../gold/dataClassifierHS/Print_sim_PS_9.csv | 10 + .../gold/ensembleDiscrete/Print_sim_PS.csv | 100 ++++ .../tests/gold/ensembleMixed/Print_sim_PS.csv | 100 ++++ .../tests/gold/graphModel/Print_sim_PS.csv | 21 + .../tests/gold/graphModelTD/Print_sim_PS.csv | 101 ++++ .../gold/markovModel_2states/Print_sim_PS.csv | 101 ++++ .../markovModel_2states_tau/Print_sim_PS.csv | 101 ++++ .../gold/markovModel_3states/Print_sim_PS.csv | 101 ++++ .../Print_sim_PS.csv | 101 ++++ .../Print_sim_PS.csv | 101 ++++ .../PRAplugin/tests/graphModel/graphTest.xml | 35 ++ .../tests/graphModelTD/graphTestTD.xml | 19 + plugins/PRAplugin/tests/test_ETmodel.xml | 76 +++ plugins/PRAplugin/tests/test_ETmodel_TD.xml | 79 +++ plugins/PRAplugin/tests/test_FTmodel.xml | 80 +++ plugins/PRAplugin/tests/test_FTmodel_TD.xml | 83 +++ .../test_dataClassifier_postprocessor.xml | 140 ++++++ .../test_dataClassifier_postprocessor_HS.xml | 140 ++++++ .../tests/test_ensemblePRAModel_discrete.xml | 189 +++++++ .../tests/test_ensemblePRAModel_mixed.xml | 193 +++++++ plugins/PRAplugin/tests/test_graphModel.xml | 81 +++ .../PRAplugin/tests/test_graphModel_TD.xml | 82 +++ .../tests/test_markovModel_2states.xml | 93 ++++ .../tests/test_markovModel_2states_tau.xml | 93 ++++ .../tests/test_markovModel_3states.xml | 99 ++++ .../test_markovModel_3states_complexTrans.xml | 93 ++++ .../test_markovModel_3states_instantTrans.xml | 93 ++++ plugins/PRAplugin/tests/tests | 93 ++++ tests/crow/test_utils.py | 1 + .../gold/test_markov/Grid_dump.csv | 4 + .../gold/test_markov/MC_dump.csv | 101 ++++ .../test_distributionsMarkov.xml | 137 +++++ .../Distributions/test_markov/simple.py | 18 + tests/framework/Distributions/tests | 8 +- .../ETimporterExpand/eventTree.xml | 33 ++ .../ETimporter_3branches/eventTree.xml | 41 ++ .../eventTree.xml | 41 ++ .../eventTree.xml | 41 ++ .../gold/ETimporter/PrintPS.xml | 1 - .../gold/ETimporterCoupledET/PrintPS.xml | 1 - .../gold/ETimporterDefineBranch/PrintPS.xml | 1 - .../gold/ETimporterExpand/PrintPS.csv | 9 + .../ETimporterSymbolicSequence/PrintPS.xml | 1 - .../gold/ETimporter_3branches/PrintPS.csv | 7 + .../PrintPS.csv | 7 + .../PrintPS.csv | 19 + .../test_ETimporter.xml | 3 +- .../test_ETimporterMultipleET.xml | 3 +- .../test_ETimporterSymbolic.xml | 3 +- .../test_ETimporter_3branches.xml | 56 +++ ...test_ETimporter_3branches_NewNumbering.xml | 56 +++ ...porter_3branches_NewNumbering_expanded.xml | 56 +++ .../test_ETimporter_DefineBranch.xml | 3 +- .../test_ETimporter_expand.xml | 56 +++ .../ETimporterPostProcessor/tests | 27 +- .../FTimporter_and/FT_and.xml | 12 + .../FTimporter_and_withNOT/FT_and_NOT.xml | 16 + .../FT_and_NOT_embedded.xml | 13 + .../FT_and_withNOT_withNOT_embedded.xml | 18 + .../FTimporter_atleast/FT_atleast.xml | 12 + .../FTimporter_cardinality/FT_cardinality.xml | 13 + .../FTimporter_component/FT1.xml | 47 ++ .../FTimporter_doubleNot.xml | 11 + .../FTimporter_iff/FT_iff.xml | 10 + 
.../FTimporter_imply/FT_imply.xml | 10 + .../trans_model_data.xml | 21 + .../FTimporter_multipleFTs/trans_one.xml | 16 + .../FTimporter_multipleFTs/trans_two.xml | 19 + .../FTimporter_nand/FT_nand.xml | 11 + .../FTimporter_nor/FT_nor.xml | 11 + .../FTimporter_not/FTimporter_not.xml | 9 + .../FTimporter_or/FT_or.xml | 11 + .../FT_or_houseEvent.xml | 14 + .../FTimporter_xor/FT_xor.xml | 11 + .../gold/FTimporter_and/PrintPS.csv | 9 + .../gold/FTimporter_and_withNOT/PrintPS.csv | 9 + .../PrintPS.csv | 9 + .../PrintPS.csv | 9 + .../gold/FTimporter_atleast/PrintPS.csv | 17 + .../gold/FTimporter_cardinality/PrintPS.csv | 33 ++ .../gold/FTimporter_component/PrintPS.csv | 65 +++ .../gold/FTimporter_doubleNot/PrintPS.csv | 3 + .../gold/FTimporter_iff/PrintPS.csv | 5 + .../gold/FTimporter_imply/PrintPS.csv | 5 + .../gold/FTimporter_multipleFTs/PrintPS.csv | 5 + .../gold/FTimporter_nand/PrintPS.csv | 9 + .../gold/FTimporter_nor/PrintPS.csv | 9 + .../gold/FTimporter_not/PrintPS.csv | 3 + .../gold/FTimporter_or/PrintPS.csv | 9 + .../gold/FTimporter_or_houseEvent/PrintPS.csv | 5 + .../gold/FTimporter_xor/PrintPS.csv | 9 + .../test_FTimporter_and.xml | 56 +++ .../test_FTimporter_and_withNOT.xml | 56 +++ .../test_FTimporter_and_withNOT_embedded.xml | 56 +++ ...Timporter_and_withNOT_withNOT_embedded.xml | 56 +++ .../test_FTimporter_atleast.xml | 56 +++ .../test_FTimporter_cardinality.xml | 56 +++ .../test_FTimporter_component.xml | 56 +++ .../test_FTimporter_doubleNot.xml | 56 +++ .../test_FTimporter_iff.xml | 56 +++ .../test_FTimporter_imply.xml | 56 +++ .../test_FTimporter_multipleFTs.xml | 60 +++ .../test_FTimporter_nand.xml | 56 +++ .../test_FTimporter_nor.xml | 56 +++ .../test_FTimporter_not.xml | 56 +++ .../test_FTimporter_or.xml | 56 +++ .../test_FTimporter_or_houseEvent.xml | 56 +++ .../test_FTimporter_xor.xml | 56 +++ .../FTimporterPostProcessor/tests | 87 ++++ 177 files changed, 9210 insertions(+), 460 deletions(-) create mode 100644 framework/PostProcessors/DataClassifier.py create mode 100644 framework/PostProcessors/ETStructure.py create mode 100644 framework/PostProcessors/FTGate.py create mode 100644 framework/PostProcessors/FTImporter.py create mode 100644 framework/PostProcessors/FTStructure.py create mode 100644 plugins/PRAplugin/doc/Introduction.tex create mode 100644 plugins/PRAplugin/doc/Makefile create mode 100644 plugins/PRAplugin/doc/figures/ET.pdf create mode 100644 plugins/PRAplugin/doc/figures/FT.pdf create mode 100644 plugins/PRAplugin/doc/figures/RBD.pdf create mode 100644 plugins/PRAplugin/doc/figures/markov.pdf create mode 100644 plugins/PRAplugin/doc/include/DataClassifier.tex create mode 100644 plugins/PRAplugin/doc/include/ETdataImporter.tex create mode 100644 plugins/PRAplugin/doc/include/ETmodel.tex create mode 100644 plugins/PRAplugin/doc/include/FTdataImporter.tex create mode 100644 plugins/PRAplugin/doc/include/FTmodel.tex create mode 100644 plugins/PRAplugin/doc/include/MarkovModel.tex create mode 100644 plugins/PRAplugin/doc/include/RBDmodel.tex create mode 100755 plugins/PRAplugin/doc/make_pra_plugin_docs.sh create mode 100644 plugins/PRAplugin/doc/user_manual.bib create mode 100644 plugins/PRAplugin/doc/user_manual.tex create mode 100644 plugins/PRAplugin/src/ETModel.py create mode 100644 plugins/PRAplugin/src/FTModel.py create mode 100644 plugins/PRAplugin/src/GraphModel.py create mode 100644 plugins/PRAplugin/src/MarkovModel.py create mode 100644 plugins/PRAplugin/tests/ETmodel/eventTree.xml create mode 100644 plugins/PRAplugin/tests/ETmodelTD/eventTree.xml 
create mode 100644 plugins/PRAplugin/tests/FTmodel/FT1.xml create mode 100644 plugins/PRAplugin/tests/FTmodelTD/FT1.xml create mode 100644 plugins/PRAplugin/tests/dataClassifier/THmodel.py create mode 100644 plugins/PRAplugin/tests/dataClassifier/eventTree.xml create mode 100644 plugins/PRAplugin/tests/dataClassifier/func_ACC.py create mode 100644 plugins/PRAplugin/tests/dataClassifier/func_LPI.py create mode 100644 plugins/PRAplugin/tests/dataClassifier/func_LPR.py create mode 100644 plugins/PRAplugin/tests/dataClassifierHS/THmodelTD.py create mode 100644 plugins/PRAplugin/tests/dataClassifierHS/eventTree.xml create mode 100644 plugins/PRAplugin/tests/dataClassifierHS/func_ACC.py create mode 100644 plugins/PRAplugin/tests/dataClassifierHS/func_LPI.py create mode 100644 plugins/PRAplugin/tests/dataClassifierHS/func_LPR.py create mode 100644 plugins/PRAplugin/tests/ensembleDiscrete/FT1.xml create mode 100644 plugins/PRAplugin/tests/ensembleDiscrete/FT2.xml create mode 100644 plugins/PRAplugin/tests/ensembleDiscrete/RBD.xml create mode 100644 plugins/PRAplugin/tests/ensembleDiscrete/eventTree.xml create mode 100644 plugins/PRAplugin/tests/ensembleMixed/FT1.xml create mode 100644 plugins/PRAplugin/tests/ensembleMixed/FT2.xml create mode 100644 plugins/PRAplugin/tests/ensembleMixed/RBD.xml create mode 100644 plugins/PRAplugin/tests/ensembleMixed/eventTree.xml create mode 100644 plugins/PRAplugin/tests/gold/ETmodel/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/ETmodelTD/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/FTmodel/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/FTmodelTD/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifier/Print_ET_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifier/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_0.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_3.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_7.csv create mode 100644 plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_9.csv create mode 100644 plugins/PRAplugin/tests/gold/ensembleDiscrete/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/ensembleMixed/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/graphModel/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/graphModelTD/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/markovModel_2states/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/markovModel_2states_tau/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/markovModel_3states/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/markovModel_3states_complexTrans/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/gold/markovModel_3states_instantTrans/Print_sim_PS.csv create mode 100644 plugins/PRAplugin/tests/graphModel/graphTest.xml create mode 100644 plugins/PRAplugin/tests/graphModelTD/graphTestTD.xml create mode 100644 plugins/PRAplugin/tests/test_ETmodel.xml create mode 100644 plugins/PRAplugin/tests/test_ETmodel_TD.xml create mode 100644 plugins/PRAplugin/tests/test_FTmodel.xml create mode 100644 plugins/PRAplugin/tests/test_FTmodel_TD.xml create mode 100644 plugins/PRAplugin/tests/test_dataClassifier_postprocessor.xml create mode 100644 plugins/PRAplugin/tests/test_dataClassifier_postprocessor_HS.xml create mode 100644 
plugins/PRAplugin/tests/test_ensemblePRAModel_discrete.xml create mode 100644 plugins/PRAplugin/tests/test_ensemblePRAModel_mixed.xml create mode 100644 plugins/PRAplugin/tests/test_graphModel.xml create mode 100644 plugins/PRAplugin/tests/test_graphModel_TD.xml create mode 100644 plugins/PRAplugin/tests/test_markovModel_2states.xml create mode 100644 plugins/PRAplugin/tests/test_markovModel_2states_tau.xml create mode 100644 plugins/PRAplugin/tests/test_markovModel_3states.xml create mode 100644 plugins/PRAplugin/tests/test_markovModel_3states_complexTrans.xml create mode 100644 plugins/PRAplugin/tests/test_markovModel_3states_instantTrans.xml create mode 100644 plugins/PRAplugin/tests/tests create mode 100644 tests/framework/Distributions/gold/test_markov/Grid_dump.csv create mode 100644 tests/framework/Distributions/gold/test_markov/MC_dump.csv create mode 100644 tests/framework/Distributions/test_distributionsMarkov.xml create mode 100644 tests/framework/Distributions/test_markov/simple.py create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/ETimporterExpand/eventTree.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches/eventTree.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering/eventTree.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering_expanded/eventTree.xml delete mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter/PrintPS.xml delete mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterCoupledET/PrintPS.xml delete mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterDefineBranch/PrintPS.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterExpand/PrintPS.csv delete mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterSymbolicSequence/PrintPS.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches/PrintPS.csv create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering/PrintPS.csv create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering_expanded/PrintPS.csv create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering_expanded.xml create mode 100644 tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_expand.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and/FT_and.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT/FT_and_NOT.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_embedded/FT_and_NOT_embedded.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_withNOT_embedded/FT_and_withNOT_withNOT_embedded.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_atleast/FT_atleast.xml create mode 100644 
tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_cardinality/FT_cardinality.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_component/FT1.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_doubleNot/FTimporter_doubleNot.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_iff/FT_iff.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_imply/FT_imply.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_model_data.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_one.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_two.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nand/FT_nand.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nor/FT_nor.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_not/FTimporter_not.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or/FT_or.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or_houseEvent/FT_or_houseEvent.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_xor/FT_xor.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_embedded/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_withNOT_embedded/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_atleast/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_cardinality/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_component/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_doubleNot/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_iff/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_imply/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_multipleFTs/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nand/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nor/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_not/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or_houseEvent/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_xor/PrintPS.csv create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and.xml create mode 100644 
tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_embedded.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_withNOT_embedded.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_atleast.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_cardinality.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_component.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_doubleNot.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_iff.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_imply.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_multipleFTs.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nand.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nor.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_not.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or_houseEvent.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_xor.xml create mode 100644 tests/framework/PostProcessors/FTimporterPostProcessor/tests diff --git a/developer_tools/XSDSchemas/Distributions.xsd b/developer_tools/XSDSchemas/Distributions.xsd index 0146279788..287fc5e68d 100644 --- a/developer_tools/XSDSchemas/Distributions.xsd +++ b/developer_tools/XSDSchemas/Distributions.xsd @@ -19,10 +19,11 @@ + - - - + + + @@ -198,13 +199,28 @@ - + + + + + + + + + + + + + + + + diff --git a/doc/user_manual/ProbabilityDistributions.tex b/doc/user_manual/ProbabilityDistributions.tex index 767db49f7e..37719b4dc4 100644 --- a/doc/user_manual/ProbabilityDistributions.tex +++ b/doc/user_manual/ProbabilityDistributions.tex @@ -903,6 +903,72 @@ \subsubsection{1-Dimensional Discrete Distributions.} \end{lstlisting} +\paragraph{Markov Categorical Distribution} +\label{subsec:markovCategorical} + +The \textbf{MarkovCategorical} distribution is a specific discrete categorical distribution describes +a random variable that can have $K$ possible outcomes, based on the steady state probabilities provided by +Markov model. +% +\begin{itemize} + \item \xmlNode{transition}, \xmlDesc{float, optional field}, the transition matrix of given Markov model. + \item \xmlNode{dataFile}, \xmlDesc{string, optional xml node}. The path for the given data file, i.e. the transition matrix. + In this node, the following attribute should be specified: + \begin{itemize} + \item \xmlAttr{fileType}, \xmlDesc{string, optional field}, the type of given data file, default is `csv'. + \end{itemize} + \nb Either \xmlNode{transition} or \xmlNode{dataFile} is required to provide the transition matrix. + \item \xmlNode{workingDir}, \xmlDesc{string, optional field}, the path of working directory + \item \xmlNode{state}, \xmlDesc{required xml node}. The output from this state indicates + the probability for outcome 1. 
+ In this node, the following attribute should be specified: + \begin{itemize} + \item \xmlAttr{outcome}, \xmlDesc{float, required field}, outcome value. + \item \xmlAttr{index}, \xmlDesc{integer, required field}, the index of steady state probabilities corresponding to the transition matrix. + \end{itemize} + \item \xmlNode{state}, \xmlDesc{required xml node}. The output from this state indicates + the probability for outcome 2. + In this node, the following attribute should be specified: + \begin{itemize} + \item \xmlAttr{outcome}, \xmlDesc{float, required field}, outcome value. + \item \xmlAttr{index}, \xmlDesc{integer, required field}, the index of steady state probabilities corresponding to the transition matrix. + \end{itemize} + \item ... + \item \xmlNode{state}, \xmlDesc{required xml node}. The output from this state indicates + the probability for outcome K. + In this node, the following attribute should be specified: + \begin{itemize} + \item \xmlAttr{outcome}, \xmlDesc{float, required field}, outcome value. + \item \xmlAttr{index}, \xmlDesc{integer, required field}, the index of steady state probabilities corresponding to the transition matrix. + \end{itemize} + +\end{itemize} + +\textbf{Example:} + +\begin{lstlisting}[style=XML] + + ... + + ... + + + + -1.1 0.8 0.7 + 0.8 -1.4 0.2 + 0.3 0.6 -0.9 + + + + + + ... + + ... + +\end{lstlisting} + + %%%%%% N-Dimensional Probability distributions \subsection{N-Dimensional Probability Distributions} diff --git a/doc/user_manual/model.tex b/doc/user_manual/model.tex index 9198904646..f5f9bbe998 100644 --- a/doc/user_manual/model.tex +++ b/doc/user_manual/model.tex @@ -937,7 +937,7 @@ \section{Models} %Material|Fuel|thermal_conductivity \subsection{Code} \label{subsec:models_code} -As already mentioned, the \textbf{Code} model represents an external system +The \textbf{Code} model represents an external system software employing a high fidelity physical model. % The link between RAVEN and the driven code is performed at run-time, through @@ -971,6 +971,10 @@ \subsection{Code} \begin{itemize} \item \xmlNode{executable} \xmlDesc{string, required field} specifies the path of the executable to be used. + + \item \xmlNode{walltime} \xmlDesc{string, optional field} specifies the maximum + allowed run time of the code; if the code running time is greater than the specified + walltime then the code run is stopped. The stopped run is then considered as if it crashed. % \nb Either an absolute or relative path can be used. \item \aliasSystemDescription{Code} @@ -1535,7 +1539,7 @@ \subsection{EnsembleModel} % The user can specify as many \xmlNode{Output} (s) as needed. The optional \xmlNode{Output}s can be of both classes ``DataObjects'' and ``Databases'' - (e.g. \textit{PointSet}, \textit{HistorySet}, \textit{DataSet}, \textit{HDF5}). + (e.g. 
\textit{PointSet}, \textit{HistorySet}, \textit{DataSet}, \textit{HDF5}) \nb \textbf{The \xmlNode{Output} (s) here specified MUST be listed in the Step in which the EnsembleModel is used.} \end{itemize} % diff --git a/doc/user_manual/postprocessor.tex b/doc/user_manual/postprocessor.tex index cfae5ccb28..4857ab8b95 100644 --- a/doc/user_manual/postprocessor.tex +++ b/doc/user_manual/postprocessor.tex @@ -23,6 +23,7 @@ \subsection{PostProcessor} \item \textbf{DataMining} \item \textbf{Metric} \item \textbf{CrossValidation} + \item \textbf{DataClassifier} \item \textbf{ValueDuration} \item \textbf{SampleSelector} %\item \textbf{PrintCSV} @@ -1759,8 +1760,49 @@ \subsubsection{RavenOutput} \subsubsection{ETImporter} \label{ETImporterPP} The \textbf{ETImporter} post-processor has been designed to import Event-Tree (ET) object into -RAVEN. This is performed by saving the structure of the ET (from file) as a \textbf{PointSet} (only \textbf{PointSet} are allowed). -Since an ET is a static Boolean logic structure, the \textbf{PointSet} is structured as follows: +RAVEN. Since several ET file formats are available, as of now only the OpenPSA format +(see https://open-psa.github.io/joomla1.5/index.php.html) is supported. As an example, +the OpenPSA format ET is shown below: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=ET in OpenPSA format., label=lst:ETModel] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +\end{lstlisting} + +This is performed by saving the structure of the ET (from file) as a \textbf{PointSet} +(only \textbf{PointSet} are allowed), since an ET is a static Boolean logic structure. Each realization in the +\textbf{PointSet} represents a unique accident sequence of the ET, and the \textbf{PointSet} is structured as follows: \begin{itemize} \item Input variables of the \textbf{PointSet} are the branching conditions of the ET. The value of each input variable can be: \begin{itemize} @@ -1770,13 +1812,64 @@ \subsubsection{ETImporter} \end{itemize} \item Output variables of the \textbf{PointSet} are the ID of each branch of the ET (i.e., positive integers greater than 0) \end{itemize} -Since several ET file formats are available, as of now only the OpenPSA format (see https://open-psa.github.io/joomla1.5/index.php.html) is supported. + +\nb that the 0 or 1 values are specified in the \xmlNode{path state="0"} or \xmlNode{path state="1"} nodes in the ET OpenPSA file. + +Provided this definition, the ET described in Listing~\ref{lst:ETModel} will be converted to \textbf{PointSet} that is characterized +by the following variables: +\begin{itemize} + \item Input variables: statusACC, statusLPI, statusLPR + \item Output variable: sequence +\end{itemize} +and the corresponding \textbf{PointSet} if the \xmlNode{expand} node is set to False is shown in Table~\ref{PointSetETExpandFalse}. +If \xmlNode{expand} set to True, the corresponding \textbf{PointSet} is shown in Table~\ref{PointSetETExpandTrue}. +\begin{table}[h] + \centering + \caption{PointSet generated by RAVEN by employing the ET Importer Post-Processor with \xmlNode{expand} + set to False for the ET of Listing~\ref{lst:ETModel}.} + \label{PointSetETExpandFalse} + \begin{tabular}{c | c | c | c} + \hline + ACC & LPI & LPR & sequence \\ + \hline + 0. & 0. & 0. & 1. \\ + 0. & 0. & 1. & 2. \\ + 0. & 1. & -1. & 3. \\ + 1. & -1. & -1. & 4. 
\\ + \hline + \end{tabular} +\end{table} +\begin{table}[h] + \centering + \caption{PointSet generated by RAVEN by employing the ET Importer Post-Processor with \xmlNode{expand} + set to True for the ET of Listing~\ref{lst:ETModel}.} + \label{PointSetETExpandTrue} + \begin{tabular}{c | c | c | c} + \hline + ACC & LPI & LPR & sequence \\ + \hline + 0. & 0. & 0. & 1. \\ + 0. & 0. & 1. & 2. \\ + 0. & 1. & 0. & 3. \\ + 0. & 1. & 1. & 3. \\ + 1. & 0. & 0. & 4. \\ + 1. & 0. & 1. & 4. \\ + 1. & 1. & 0. & 4. \\ + 1. & 1. & 1. & 4. \\ + \hline + \end{tabular} +\end{table} + The ETImporter PP supports also: \begin{itemize} \item links to sub-trees + \nb If the ET is split in two or more ETs (and thus one file for each ET), then it is only required to list + all files in the Step. RAVEN automatically detect links among ETs and merge all of them into a single PointSet. \item by-pass branches \item symbolic definition of outcomes: typically outcomes are defined as either 0 (upper branch) or 1 (lower branch). If instead the ET uses the - success/failure labels, then they are converted into 0/1 labels + \textbf{success/failure} labels, then they are converted into 0/1 labels + \nb If the branching condition is not binary or \textbf{success/failure}, then the ET Importer Post-Processor just follows + the numerical value of the \xmlNode{state} attribute of the \xmlNode{} node in the ET OpenPSA file. \item symbolic/numerical definition of sequences: if the ET contains a symbolic sequence then a .xml file is generated. This file contains the mapping between the sequences defined in the ET and the numerical IDs created by RAVEN. The file name is the concatenation of the ET name and "\_mapping". As an example the file "eventTree\_mapping.xml" generated by RAVEN: @@ -1790,6 +1883,8 @@ \subsubsection{ETImporter} \end{lstlisting} contains the mapping of four sequences defined in the ET (seq\_1,seq\_2,seq\_3,seq\_4) with the IDs generated by RAVEN (0,1,2,3). Note that if the sequences defined in the ET are both numerical and symbolic then they are all mapped. + \item The ET can contain a branch that is defined as a separate block in the \xmlNode{define-branch} node and it is + replicated in the ET; in such case RAVEN automatically replicate such branch when generating the PointSet. \end{itemize} The \xmlNode{collect-formula} are not considered since this node is used to connect the Boolean formulae generated by the Fault-Trees to the branch (i.e., fork) point. @@ -1799,33 +1894,220 @@ \subsubsection{ETImporter} % \begin{itemize} \item \xmlNode{fileFormat}, \xmlDesc{string, required field}, specifies the format of the file that contains the ET structure (supported format: OpenPSA). + \item \xmlNode{expand},\xmlDesc{bool, required parameter}, expand the ET branching conditions for all branches even if they are not queried \end{itemize} \textbf{Example:} - -\begin{lstlisting}[style=XML] - - ... +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=ET Importer input example., label=lst:ET_PP_InputExample] + + eventTree.xml + + ... - + OpenPSA - - ... + False + + ... - ... + ... eventTreeTest - ETImporter + ETimporter ET_PS ... - + + + + ... + + ACC,LPI,LPR + sequence + + ... + \end{lstlisting} +%%%%%%%%%%%%%% FTImporter PP %%%%%%%%%%%%%%%%%%% + +\subsubsection{FTImporter} +\label{FTImporterPP} +The \textbf{FTImporter} post-processor has been designed to import Fault-Tree (FT) object into +RAVEN. 
Since several FT file formats are available, as of now only the OpenPSA format +(see https://open-psa.github.io/joomla1.5/index.php.html) is supported. As an example, +the FT in OpenPSA format is shown in Listing~\ref{lst:FTModel}. + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=FT in OpenPSA format., label=lst:FTModel] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +\end{lstlisting} + +This is performed by saving the structure of the FT (from file) as a \textbf{PointSet} +(only \textbf{PointSet} are allowed). + +Each Point in the PointSet represents a unique combination of the basic events. +The PointSet is structured as follows: input variables are the basic events, output variable is the top event of the FT. +The value for each input and output variable can have the following values: +\begin{itemize} + \item 0: False + \item 1: True +\end{itemize} + +Provided this definition, the FT model of Listing~\ref{lst:FTModel} can be converted to +\textbf{PointSet} that is characterized by these variables: +\begin{itemize} + \item Input variables: BE1, BE2, BE3, BE4 + \item Output variable: out +\end{itemize} +and it is structured is shown in Table~\ref{PointSetFT}. + +\begin{table}[h] + \centering + \caption{PointSet generated by RAVEN by employing the FT Importer Post-Processor for the FT of Listing~\ref{lst:FTModel}.} + \label{PointSetFT} + \begin{tabular}{c | c | c | c | c} + \hline + BE1 & BE2 & BE3 & BE4 & TOP \\ + \hline + 0. & 0. & 0. & 0. & 0. \\ + 0. & 0. & 0. & 1. & 0. \\ + 0. & 0. & 1. & 0. & 0. \\ + 0. & 0. & 1. & 1. & 0. \\ + 0. & 1. & 0. & 0. & 0. \\ + 0. & 1. & 0. & 1. & 0. \\ + 0. & 1. & 1. & 0. & 0. \\ + 0. & 1. & 1. & 1. & 0. \\ + 1. & 0. & 0. & 0. & 0. \\ + 1. & 0. & 0. & 1. & 1. \\ + 1. & 0. & 1. & 0. & 1. \\ + 1. & 0. & 1. & 1. & 1. \\ + 1. & 1. & 0. & 0. & 1. \\ + 1. & 1. & 0. & 1. & 1. \\ + 1. & 1. & 1. & 0. & 1. \\ + 1. & 1. & 1. & 1. & 1. \\ + \hline + \end{tabular} +\end{table} + +% +\ppType{FTImporter}{FTImporter} +% +\begin{itemize} + \item \xmlNode{fileFormat}, \xmlDesc{string, required field}, specifies the format of the file that contains the + FT structure (supported format: OpenPSA). + \item \xmlNode{topEventID},\xmlDesc{string, required parameter}, the name of the top event of the FT +\end{itemize} + +The example of FTImporter PostProcessor is shown in Listing~\ref{lst:FT_PP_InputExample} +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=FT Importer input example., label=lst:FT_PP_InputExample] + + FTimporter_not.xml + + + + ... + + OpenPSA + TOP + + ... + + + + ... + + faultTreeTest + FTimporter + FT_PS + + ... + + + + ... + + BE1,BE2,BE3,BE4 + TOP + + ... + +\end{lstlisting} + +Important notes and capabilities: +\begin{itemize} + \item If the FT is split in two or more FTs (and thus one file for each FT), then it is only required to list + all files in the Step. RAVEN automatically detect links among FTs and merge all of them into a single PointSet. 
+ \item Allowed gates: AND, OR, NOT, ATLEAST, CARDINALITY, IFF, imply, NAND, NOR, XOR + \item If an house-event is defined in the FT: +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=FT Importer input example: house-event., label=lst:FT_house event] + + + + + + + + + + + + + + +\end{lstlisting} + then the HE1 is not part of the PointSet (value is fixed) +\end{itemize} + %%%%%%%%%%%%%% Metric PP %%%%%%%%%%%%%%%%%%% \subsubsection{Metric} @@ -2149,6 +2431,117 @@ \subsubsection{CrossValidation} \end{itemize} +%%%%%%%%%%%%%% Data Classifier PP %%%%%%%%%%%%%%%%%%% + +\subsubsection{DataClassifier} +\label{DataClassifierPP} +The \textbf{DataClassifier} post-processor is specifically used to classify the data stored in the DataObjects. It +accepts two DataObjects, one is used as the classifier which must be a \textbf{PointSet}, the other one, +i.e. \textbf{PointSet} or \textbf{HistorySet}, is used as the input DataObject to be classified. +% +\ppType{DataClassifier}{DataClassifier} +% +\begin{itemize} + \item \xmlNode{label}, \xmlDesc{string, required field}, the name of the label that are used for the classifier. This + label must exist in the DataObject that is used as the classifer. This name will also be used as + the label name for the DataObject that is classified. + \item \xmlNode{variable}, \xmlDesc{required, xml node}. In this node, the following attribute should be specified: + \begin{itemize} + \item \xmlAttr{name}, \xmlDesc{required, string attribute}, the variable name, which should be exist in + the DataObject that is used as classifier. + \end{itemize} + and the following sub-node should also be specified: + \begin{itemize} + \item \xmlNode{Function}, \xmlDesc{string, required field}, this function creates the mapping from input DataObject + to the Classifier. + \begin{itemize} + \item \xmlAttr{class}, \xmlDesc{string, required field}, the class of this function (e.g. Functions) + \item \xmlAttr{type}, \xmlDesc{string, required field}, the type of this function (e.g. external) + \end{itemize} + \end{itemize} +\end{itemize} +% +In order to use this post-processor, the users need to specify two different DataObjects, i.e. +\begin{lstlisting}[style=XML] + + + ACC, LPI + sequence + + + ACC_status, LPI_status + out + + +\end{lstlisting} +The first data object ``ET\_PS'' contains the event tree with input variables ``ACC, LPI'' and output label ``sequence''. +This data object will be used to classify the data in the second data object ``sim\_PS''. The results will be stored in +the output data object with the same label ``sequence''. Since these two data objects contain different inputs, +\xmlNode{Functions} will be used to create the maps between the inputs: +\begin{lstlisting}[style=XML] + + + ACC_status + + + LPI_status + + +\end{lstlisting} + +The inputs to these functions are the inputs of the data object that will be classified, and the outputs of these functions +are the inputs of data object that is used as the classifier. + +\textbf{Example Python Function for ``func\_ACC.py''} +\begin{lstlisting}[language=python] +def evaluate(self): + return self.ACC_status +\end{lstlisting} + +\textbf{Example Python Function for ``func\_LPI.py''} +\begin{lstlisting}[language=python] +def evaluate(self): + return self.LPI_status +\end{lstlisting} + +\nb All the functions that are used to create the maps should be include the ``evaluate'' method. + +The \textbf{DataClassifier} post-processor is specifically used to classify the data stored in the DataObjects. 
It +accepts two DataObjects, one is used as the classifier which should be always \textbf{PointSet}, the other one, i.e. +either \textbf{PointSet} or \textbf{HistorySet} is used as the input DataObject to be classified. + +The \textbf{DataClassifier} is provided below: +\begin{lstlisting}[style=XML] + + + + func_ACC + + + func_LPI + + +\end{lstlisting} +The definitions for the XML nodes can be found in the RAVEN user manual. The label ``sequence'' +and the variables ``ACC, LPI'' should be exist in the data object that is used as the classifier, +while the functions ``func\_ACC, func\_LPI'' are used to map relationships between the input data objects. + +The classification can be achieved via the \xmlNode{Steps} as shown below: +\begin{lstlisting}[style=XML] + + ... + + + ET_PS + sim_PS + ET_Classifier + sim_PS + + + ... + +\end{lstlisting} + %%%%%%%%%%%%%% ValueDuration %%%%%%%%%%%%%%%%%%% \subsubsection{ValueDuration} \label{ValueDurationPP} diff --git a/framework/DataObjects/DataSet.py b/framework/DataObjects/DataSet.py index eb414fec98..4f1e5ec218 100644 --- a/framework/DataObjects/DataSet.py +++ b/framework/DataObjects/DataSet.py @@ -247,6 +247,13 @@ def addVariable(self,varName,values,classify='meta'): self._inputs.append(varName) elif classify == 'output': self._outputs.append(varName) + if type(values[0]) == xr.DataArray: + indexes = values[0].sizes.keys() + for index in indexes: + if index in self._pivotParams.keys(): + self._pivotParams[index].append(varName) + else: + self._pivotParams[index]=[varName] else: self._metavars.append(varName) self._orderedVars.append(varName) diff --git a/framework/Distributions.py b/framework/Distributions.py index c733c8a3d4..05b5584da8 100644 --- a/framework/Distributions.py +++ b/framework/Distributions.py @@ -33,7 +33,10 @@ from collections import OrderedDict import csv from scipy.interpolate import UnivariateSpline +from numpy import linalg as LA +import copy import math as math + #External Modules End-------------------------------------------------------------------------------- #Internal Modules------------------------------------------------------------------------------------ @@ -69,6 +72,7 @@ def factorial(x): 'Custom1D':'Custom1DDistribution', 'Exponential':'ExponentialDistribution', 'Categorical':'Categorical', + 'MarkovCategorical':'MarkovCategorical', 'LogNormal':'LogNormalDistribution', 'Weibull':'WeibullDistribution', 'NDInverseWeight': 'NDInverseWeightDistribution', @@ -1704,6 +1708,133 @@ def rvs(self): DistributionsCollection.addSub(Categorical.getInputSpecification()) +class MarkovCategorical(Categorical): + """ + Class for the Markov categorical distribution based on "Markov Model" + Note: this distribution can have only numerical (float) outcome; in the future we might want to include also the possibility to give symbolic outcome + """ + + @classmethod + def getInputSpecification(cls): + """ + Method to get a reference to a class that specifies the input data for + class cls. + @ In, cls, the class for which we are retrieving the specification + @ Out, inputSpecification, InputData.ParameterInput, class to use for + specifying input of cls. 
+ """ + inputSpecification = InputData.parameterInputFactory(cls.__name__, ordered=True, baseNode=None) + + StatePartInput = InputData.parameterInputFactory("state", contentType=InputData.StringType) + StatePartInput.addParam("outcome", InputData.FloatType, True) + StatePartInput.addParam("index", InputData.IntegerType, True) + TransitionInput = InputData.parameterInputFactory("transition", contentType=InputData.StringType) + inputSpecification.addSub(StatePartInput, InputData.Quantity.one_to_infinity) + inputSpecification.addSub(TransitionInput, InputData.Quantity.zero_to_one) + inputSpecification.addSub(InputData.parameterInputFactory("workingDir", contentType=InputData.StringType)) + ## Because we do not inherit from the base class, we need to manually + ## add the name back in. + inputSpecification.addParam("name", InputData.StringType, True) + + return inputSpecification + + def __init__(self): + """ + Function that initializes the categorical distribution + @ In, None + @ Out, none + """ + Categorical.__init__(self) + self.dimensionality = 1 + self.disttype = 'Discrete' + self.type = 'MarkovCategorical' + self.steadyStatePb = None # variable containing the steady state probabilities of the Markov Model + self.transition = None # transition matrix of a continuous time Markov Model + + def _handleInput(self, paramInput): + """ + Function to handle the common parts of the distribution parameter input. + @ In, paramInput, ParameterInput, the already parsed input. + @ Out, None + """ + workingDir = paramInput.findFirst('workingDir') + if workingDir is not None: + self.workingDir = workingDir.value + else: + self.workingDir = os.getcwd() + + for child in paramInput.subparts: + if child.getName() == "state": + outcome = child.parameterValues["outcome"] + markovIndex = child.parameterValues["index"] + self.mapping[outcome] = markovIndex + if outcome in self.values: + self.raiseAnError(IOError,'Markov Categorical distribution has identical outcomes') + else: + self.values.add(outcome) + elif child.getName() == "transition": + transition = [float(value) for value in child.value.split()] + dim = int(np.sqrt(len(transition))) + if dim == 1: + self.raiseAnError(IOError, "The dimension of transition matrix should be greater than 1!") + elif dim**2 != len(transition): + self.raiseAnError(IOError, "The transition matrix is not a square matrix!") + self.transition = np.asarray(transition).reshape((-1,dim)) + #Check the correctness of user inputs + invalid = self.transition is None + if invalid: + self.raiseAnError(IOError, "Transition matrix is not provided, please use 'transition' node to provide the transition matrix!") + if len(self.mapping.values()) != len(set(self.mapping.values())): + self.raiseAnError(IOError, "The states of Markov Categorical distribution have identifcal indices!") + + self.initializeDistribution() + + def getInitParams(self): + """ + Function to get the initial values of the input parameters that belong to + this class + @ In, None + @ Out, paramDict, dict, dictionary containing the parameter names as keys + and each parameter's initial value as the dictionary values + """ + paramDict = Distribution.getInitParams(self) + paramDict['mapping'] = self.mapping + paramDict['values'] = self.values + paramDict['transition'] = self.transition + paramDict['steadyStatePb'] = self.steadyStatePb + return paramDict + + def initializeDistribution(self): + """ + Function that initializes the distribution and checks that the sum of all state probabilities is equal to 1 + @ In, None + @ Out, 
None + """ + self.steadyStatePb = self.computeSteadyStatePb(self.transition) + for key, value in self.mapping.items(): + try: + self.mapping[key] = self.steadyStatePb[value - 1] + except IndexError: + self.raiseAnError(IOError, "Index ",value, " for outcome ", key, " is out of bounds! Maximum index should be ", len(self.steadyStatePb)) + Categorical.initializeDistribution(self) + + def computeSteadyStatePb(self, transition): + """ + Function that compute the steady state probabilities for given transition matrix + @ In, transition, numpy.array, transition matrix for Markov model + @ Out, steadyStatePb, numpy.array, 1-D array of steady state probabilities + """ + dim = transition.shape[0] + perturbTransition = copy.copy(transition) + perturbTransition[0] = 1 + q = np.zeros(dim) + q[0] = 1 + steadyStatePb = np.dot(LA.inv(perturbTransition),q) + + return steadyStatePb + +DistributionsCollection.addSub(MarkovCategorical.getInputSpecification()) + class Logistic(BoostDistribution): """ Logistic univariate distribution @@ -3493,6 +3624,7 @@ def rvs(self,*args): __interFaceDict['Binomial' ] = Binomial __interFaceDict['Bernoulli' ] = Bernoulli __interFaceDict['Categorical' ] = Categorical +__interFaceDict['MarkovCategorical' ] = MarkovCategorical __interFaceDict['Logistic' ] = Logistic __interFaceDict['Exponential' ] = Exponential __interFaceDict['LogNormal' ] = LogNormal diff --git a/framework/Models/Code.py b/framework/Models/Code.py index 6ff8ed3733..1bbd4afbf0 100644 --- a/framework/Models/Code.py +++ b/framework/Models/Code.py @@ -27,6 +27,7 @@ import importlib import platform import shlex +import time import numpy as np #External Modules End-------------------------------------------------------------------------------- @@ -58,6 +59,7 @@ class cls. inputSpecification = super(Code, cls).getInputSpecification() inputSpecification.setStrictMode(False) #Code interfaces can allow new elements. 
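+    # <executable> points to the code to be run; the new optional <walltime> node limits a single
+    # code run to a maximum wall-clock time, in seconds (when exceeded, the subprocess is killed
+    # and its return code is forced to -1)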
inputSpecification.addSub(InputData.parameterInputFactory("executable", contentType=InputData.StringType)) + inputSpecification.addSub(InputData.parameterInputFactory("walltime", contentType=InputData.FloatType)) inputSpecification.addSub(InputData.parameterInputFactory("preexec", contentType=InputData.StringType)) ## Begin command line arguments tag @@ -115,6 +117,7 @@ def __init__(self,runInfoDict): self.codeFlags = None #flags that need to be passed into code interfaces(if present) self.printTag = 'CODE MODEL' self.createWorkingDir = True + self.maxWallTime = None def _readMoreXML(self,xmlNode): """ @@ -130,7 +133,9 @@ def _readMoreXML(self,xmlNode): self.fargs={'input':{}, 'output':'', 'moosevpp':''} for child in paramInput.subparts: if child.getName() =='executable': - self.executable = str(child.value) + self.executable = child.value + if child.getName() =='walltime': + self.maxWallTime = child.value if child.getName() =='preexec': self.preExec = child.value elif child.getName() == 'clargs': @@ -201,13 +206,16 @@ def _readMoreXML(self,xmlNode): if self.executable == '': self.raiseAWarning('The node "" was not found in the body of the code model '+str(self.name)+' so no code will be run...') else: - if '~' in self.executable: - self.executable = os.path.expanduser(self.executable) - abspath = os.path.abspath(str(self.executable)) - if os.path.exists(abspath): - self.executable = abspath + if os.environ.get('RAVENinterfaceCheck','False').lower() in utils.stringsThatMeanFalse(): + if '~' in self.executable: + self.executable = os.path.expanduser(self.executable) + abspath = os.path.abspath(str(self.executable)) + if os.path.exists(abspath): + self.executable = abspath + else: + self.raiseAMessage('not found executable '+self.executable,'ExceptedError') else: - self.raiseAMessage('not found executable '+self.executable,'ExceptedError') + self.executable = '' if self.preExec is not None: if '~' in self.preExec: self.preExec = os.path.expanduser(self.preExec) @@ -488,7 +496,20 @@ def evaluateSample(self, myInput, samplerType, kwargs): ## This code should be evaluated by the job handler, so it is fine to wait ## until the execution of the external subprocess completes. process = utils.pickleSafeSubprocessPopen(command, shell=True, stdout=outFileObject, stderr=outFileObject, cwd=localenv['PWD'], env=localenv) - process.wait() + + if self.maxWallTime is not None: + timeout = time.time() + self.maxWallTime + while True: + time.sleep(0.5) + process.poll() + if time.time() > timeout and process.returncode is None: + self.raiseAWarning('walltime exeeded in run in working dir: '+str(metaData['subDirectory'])+'. Killing the run...') + process.kill() + process.returncode = -1 + if process.returncode is not None or time.time() > timeout: + break + else: + process.wait() returnCode = process.returncode # procOutput = process.communicate()[0] @@ -597,8 +618,6 @@ def evaluateSample(self, myInput, samplerType, kwargs): ## Check if the user specified any file extensions for clean up for fileExt in fileExtensionsToDelete: - if not fileExt.startswith("."): - fileExt = "." 
+ fileExt fileList = [ os.path.join(metaData['subDirectory'],f) for f in os.listdir(metaData['subDirectory']) if f.endswith(fileExt) ] for f in fileList: os.remove(f) diff --git a/framework/PostProcessors/DataClassifier.py b/framework/PostProcessors/DataClassifier.py new file mode 100644 index 0000000000..da2bfe6a03 --- /dev/null +++ b/framework/PostProcessors/DataClassifier.py @@ -0,0 +1,305 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +Created on Jan 29, 2018 + +@author: Congjian Wang +""" +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +import copy +import xarray as xr +import numpy as np +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from BaseClasses import BaseType +from utils import InputData +from .PostProcessor import PostProcessor +import MessageHandler +import Files +#Internal Modules End----------------------------------------------------------- + +class DataClassifier(PostProcessor): + """ + This Post-Processor performs data classification based on given classifier. + In order to use this interface post-processor, the users need to provide + two data objects, one (only PointSet is allowed) is used to construct the + classifier that will be used to label the data in the second data object + (both PointSet and HistorySet are allowed). + """ + + @classmethod + def getInputSpecification(cls): + """ + Method to get a reference to a class that specifies the input data for + class cls. + @ In, cls, the class for which we are retrieving the specification + @ Out, inputSpecification, InputData.ParameterInput, class to use for + specifying input of cls. 
+ """ + inputSpecification = super(DataClassifier, cls).getInputSpecification() + VariableInput = InputData.parameterInputFactory("variable", contentType=InputData.StringType) + VariableInput.addParam("name", InputData.StringType, True) + FunctionInput = InputData.parameterInputFactory("Function", contentType=InputData.StringType) + FunctionInput.addParam("class", InputData.StringType, True) + FunctionInput.addParam("type", InputData.StringType, True) + VariableInput.addSub(FunctionInput, InputData.Quantity.one) + LabelInput = InputData.parameterInputFactory("label",contentType=InputData.StringType) + inputSpecification.addSub(VariableInput, InputData.Quantity.one_to_infinity) + inputSpecification.addSub(LabelInput, InputData.Quantity.one) + + return inputSpecification + + def __init__(self, messageHandler): + """ + Constructor + @ In, messageHandler, MessageHandler, message handler object + @ Out, None + """ + PostProcessor.__init__(self, messageHandler) + self.printTag = 'POSTPROCESSOR DataClassifier' + self.mapping = {} # dictionary for mapping input space between different DataObjects {'variableName':'externalFunctionName'} + self.funcDict = {} # Contains the function to be used {'variableName':externalFunctionInstance} + self.label = None # ID of the variable which containf the label values + + def _localGenerateAssembler(self, initDict): + """ + It is used for sending to the instanciated class, which is implementing the method, the objects that have + been requested throught "whatDoINeed" method + @ In, initDict, dict, dictionary ({'mainClassName:{specializedObjectName:ObjectInstance}'}) + @ Out, None + """ + availableFunc = initDict['Functions'] + for key, val in self.mapping.items(): + if val[1] not in availableFunc.keys(): + self.raiseAnError(IOError, 'Function ', val[1], ' was not found among the available functions: ', availableFunc.keys()) + self.funcDict[key] = availableFunc[val[1]] + # check if the correct method is present + if 'evaluate' not in self.funcDict[key].availableMethods(): + self.raiseAnError(IOError, 'Function ', val[1], ' does not contain a method named "evaluate". ' + 'It mush be present if this needs to be used by other RAVEN entities!') + + def _localWhatDoINeed(self): + """ + This method is a local mirror of the general whatDoINeed method that need to request + special objects, e.g. Functions + @ In, None + @ Out, needDict, dict, dictionary of objects needed + """ + needDict = {} + needDict['Functions'] = [] + for func in self.mapping.values(): + needDict['Functions'].append(func) + return needDict + + def initialize(self, runInfo, inputs, initDict=None): + """ + Method to initialize the DataClassifier post-processor. + @ In, runInfo, dict, dictionary of run info (e.g. 
working dir, etc) + @ In, inputs, list, list of inputs + @ In, initDict, dict, optional, dictionary with initialization options + @ Out, None + """ + PostProcessor.initialize(self, runInfo, inputs, initDict) + + def _localReadMoreXML(self, xmlNode): + """ + Method to read the portion of the XML input that belongs to this specialized class + @ In, xmlNode, xml.etree.ElementTree.Element, XML element node + @ Out, None + """ + paramInput = DataClassifier.getInputSpecification()() + paramInput.parseNode(xmlNode) + self._handleInput(paramInput) + + def _handleInput(self, paramInput): + """ + Method that handles PostProcessor parameter input block + @ In, paramInput, ParameterInput, the already parsed input + @ Out, None + """ + for child in paramInput.subparts: + if child.getName() == 'variable': + func = child.findFirst('Function') + funcType = func.parameterValues['type'] + funcName = func.value.strip() + self.mapping[child.parameterValues['name']] = (funcType, funcName) + elif child.getName() == 'label': + self.label = child.value.strip() + + def inputToInternal(self, currentInput): + """ + Method to convert a list of input objects into the internal format that is + understandable by this pp. + @ In, currentInput, list, a list of DataObjects + @ Out, newInput, list, list of converted data + """ + if isinstance(currentInput,list) and len(currentInput) != 2: + self.raiseAnError(IOError, "Two inputs DataObjects are required for postprocessor", self.name) + newInput ={'classifier':{}, 'target':{}} + haveClassifier = False + haveTarget = False + for inputObject in currentInput: + if isinstance(inputObject, dict): + newInput.append(inputObject) + else: + if inputObject.type not in ['PointSet', 'HistorySet']: + self.raiseAnError(IOError, "The input for this postprocesor", self.name, "is not acceptable! Allowed inputs are 'PointSet' and 'HistorySet'.") + if len(inputObject) == 0: + self.raiseAnError(IOError, "The input", inputObject.name, "is empty!") + inputDataset = inputObject.asDataset() + inputParams = inputObject.getVars('input') + outputParams = inputObject.getVars('output') + dataType = None + mappingKeys = self.mapping.keys() + if len(set(mappingKeys)) != len(mappingKeys): + dups = set([elem for elem in mappingKeys if mappingKeys.count(elem) > 1]) + self.raiseAnError(IOError, "The same variable {} name is used multiple times in the XML input".format(dups[0])) + if set(self.mapping.keys()) == set(inputParams) and self.label in outputParams: + dataType = 'classifier' + if not haveClassifier: + haveClassifier = True + else: + self.raiseAnError(IOError, "Both input data objects have been already processed! No need to execute this postprocessor", self.name) + if inputObject.type != 'PointSet': + self.raiseAnError(IOError, "Only PointSet is allowed as classifier, but HistorySet", inputObject.name, "is provided!") + else: + dataType = 'target' + if not haveTarget: + haveTarget = True + else: + self.raiseAnError(IOError, "None of the input DataObjects can be used as the reference classifier! 
Either the label", \ + self.label, "is not exist in the output of the DataObjects or the inputs of the DataObjects are not the same as", \ + ','.join(self.mapping.keys())) + newInput[dataType]['input'] = dict.fromkeys(inputParams) + newInput[dataType]['output'] = dict.fromkeys(outputParams) + if inputObject.type == 'PointSet': + for elem in inputParams: + newInput[dataType]['input'][elem] = copy.deepcopy(inputDataset[elem].values) + for elem in outputParams: + newInput[dataType]['output'][elem] = copy.deepcopy(inputDataset[elem].values) + newInput[dataType]['type'] = inputObject.type + newInput[dataType]['name'] = inputObject.name + else: + # only extract the last element in each realization for the HistorySet + newInput[dataType]['type'] = inputObject.type + newInput[dataType]['name'] = inputObject.name + numRlzs = len(inputObject) + newInput[dataType]['historySizes'] = dict.fromkeys(range(numRlzs)) + for i in range(numRlzs): + rlz = inputObject.realization(index=i) + for elem in inputParams: + if newInput[dataType]['input'][elem] is None: + newInput[dataType]['input'][elem] = np.empty(0) + newInput[dataType]['input'][elem] = np.append(newInput[dataType]['input'][elem], rlz[elem]) + for elem in outputParams: + if newInput[dataType]['output'][elem] is None: + newInput[dataType]['output'][elem] = np.empty(0) + newInput[dataType]['output'][elem] = np.append(newInput[dataType]['output'][elem], rlz[elem].values[-1]) + if newInput[dataType]['historySizes'][i] is None: + newInput[dataType]['historySizes'][i] = len(rlz[elem].values) + + return newInput + + def run(self, inputIn): + """ + This method executes the postprocessor action. + @ In, inputIn, list, list of DataObjects + @ Out, outputDict, dict, dictionary of outputs + """ + inputDict = self.inputToInternal(inputIn) + targetDict = inputDict['target'] + classifierDict = inputDict['classifier'] + outputDict = {} + outputType = targetDict['type'] + outputDict['dataType'] = outputType + outputDict['dataFrom'] = targetDict['name'] + if outputType == 'HistorySet': + outputDict['historySizes'] = copy.copy(targetDict['historySizes']) + + numRlz = targetDict['input'].values()[0].size + outputDict[self.label] = np.empty(numRlz) + for i in range(numRlz): + tempTargDict = {} + for param, vals in targetDict['input'].items(): + tempTargDict[param] = vals[i] + for param, vals in targetDict['output'].items(): + tempTargDict[param] = vals[i] + tempClfList = [] + labelIndex = None + for key, values in classifierDict['input'].items(): + calcVal = self.funcDict[key].evaluate("evaluate", tempTargDict) + inds, = np.where(np.asarray(values) == calcVal) + if labelIndex is None: + labelIndex = set(inds) + else: + labelIndex = labelIndex & set(inds) + if len(labelIndex) != 1: + self.raiseAnError(IOError, "The parameters", ",".join(tempTargDict.keys()), "with values", ",".join([str(el) for el in tempTargDict.values()]), "could not be put in any class!") + outputDict[self.label][i] = classifierDict['output'][self.label][list(labelIndex)[0]] + + return outputDict + + def collectOutput(self, finishedJob, output): + """ + Method to place all of the computed data into output object + @ In, finishedJob, object, JobHandler object that is in charge of running this postprocessor + @ In, output, object, the object where we want to place our computed results + @ Out, None + """ + evaluation = finishedJob.getEvaluation() + inputObjects, outputDict = evaluation + + if isinstance(output, Files.File): + self.raiseAnError(IOError, "Dump results to files is not yet implemented!") + 
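+    # locate the input DataObject that the classified data was computed from (identified by
+    # 'dataFrom'), so that its realizations can be copied into the output DataObject before
+    # the new label variable is added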
+ for inp in inputObjects: + if inp.name == outputDict['dataFrom']: + inputObject = inp + break + if inputObject != output: + ## Copy any data you need from the input DataObject before adding new data + rlzs = inputObject.asDataset(outType='dict')['data'] + if output.type == 'PointSet': + output.load(rlzs, style='dict') + elif output.type == 'HistorySet': + if inputObject.type != 'HistorySet': + self.raiseAnError(IOError, "Copying the data from input PointSet", inputObject.name, "to output HistorySet", output.name, "is currently not allowed!") + output.load(rlzs, style='dict', dims=inputObject.getDimensions()) + + if output.type == 'PointSet': + output.addVariable(self.label, copy.copy(outputDict[self.label]), classify='output') + elif output.type == 'HistorySet': + numRlzs = output.size + labelValues = np.zeros(numRlzs, dtype=object) + pivotParams = tuple(output.indexes) + slices = output.sliceByIndex('RAVEN_sample_ID') + coordList = [] + for i in range(numRlzs): + coordDict = {} + for elem in pivotParams: + coordDict[elem] = slices[i].dropna(elem)[elem] + coordList.append(coordDict) + + for i in range(numRlzs): + histSize = outputDict['historySizes'][i] + values = np.empty(histSize) + values.fill(outputDict[self.label][i]) + xrArray = xr.DataArray(values, dims=pivotParams, coords=coordList[i]) + labelValues[i] = xrArray + output.addVariable(self.label, labelValues, classify='output') diff --git a/framework/PostProcessors/ETImporter.py b/framework/PostProcessors/ETImporter.py index 66b0fee4b5..8f062c070c 100644 --- a/framework/PostProcessors/ETImporter.py +++ b/framework/PostProcessors/ETImporter.py @@ -15,7 +15,6 @@ Created on Nov 1, 2017 @author: dan maljovec, mandd - """ from __future__ import division, print_function , unicode_literals, absolute_import @@ -35,12 +34,13 @@ from utils import utils import Files import Runners +from .ETStructure import ETStructure #Internal Modules End----------------------------------------------------------- class ETImporter(PostProcessor): """ - This is the base class of the postprocessor that imports Event-Trees (ETs) into RAVEN as a PointSet + This is the base class of the PostProcessor that imports Event-Trees (ETs) into RAVEN as a PointSet """ def __init__(self, messageHandler): """ @@ -49,9 +49,12 @@ def __init__(self, messageHandler): @ Out, None """ PostProcessor.__init__(self, messageHandler) - self.printTag = 'POSTPROCESSOR ET IMPORTER' - self.ETFormat = None - self.allowedFormats = ['OpenPSA'] + self.printTag = 'POSTPROCESSOR ET IMPORTER' + self.expand = None # option that controls the structure of the ET. If True, the tree is expanded so that + # all possible sequences are generated. Sequence label is maintained according to the + # original tree + self.fileFormat = None # chosen format of the ET file + self.allowedFormats = ['OpenPSA'] # ET formats that are supported @classmethod def getInputSpecification(cls): @@ -64,6 +67,7 @@ class cls. 
""" inputSpecification = super(ETImporter, cls).getInputSpecification() inputSpecification.addSub(InputData.parameterInputFactory("fileFormat", contentType=InputData.StringType)) + inputSpecification.addSub(InputData.parameterInputFactory("expand" , contentType=InputData.BoolType)) return inputSpecification def initialize(self, runInfo, inputs, initDict) : @@ -97,435 +101,45 @@ def _handleInput(self, paramInput): fileFormat = paramInput.findFirst('fileFormat') self.fileFormat = fileFormat.value if self.fileFormat not in self.allowedFormats: - self.raiseAnError(IOError, - 'ETImporterPostProcessor Post-Processor ' + self.name + ', format ' + str(self.fileFormat) + ' : is not supported') + self.raiseAnError(IOError, 'ETImporterPostProcessor Post-Processor ' + self.name + ', format ' + str(self.fileFormat) + ' : is not supported') - def run(self, inputs): - """ - This method executes the postprocessor action. - @ In, inputs, list, list of file objects - @ Out, None - """ - return self.runOpenPSA(inputs) + expand = paramInput.findFirst('expand') + self.expand = expand.value - def runOpenPSA(self, inputs): + def run(self, inputs): """ - This method executes the postprocessor action. + This method executes the PostProcessor action. @ In, inputs, list, list of file objects @ Out, None """ - ### Check for link to other ET - links = [] - sizes=(len(inputs),len(inputs)) - connectivityMatrix = np.zeros(sizes) - listETs=[] - listRoots=[] - - for file in inputs: - eventTree = ET.parse(file.getPath() + file.getFilename()) - listETs.append(eventTree.getroot().get('name')) - listRoots.append(eventTree.getroot()) - links = self.createLinkList(listRoots) - - if len(inputs)>0: - rootETID = self.checkETstructure(links,listETs,connectivityMatrix) - - if len(links)>=1 and len(inputs)>1: - finalAssembledTree = self.analyzeMultipleET(inputs,links,listRoots,listETs,rootETID) - return self.analyzeSingleET(finalAssembledTree) - - if len(links)==0 and len(inputs)>1: - self.raiseAnError(IOError, 'Multiple ET files have provided but they are not linked') - - if len(links)>1 and len(inputs)==1: - self.raiseAnError(IOError, 'A single ET files has provided but it contains a link to an additional ET') - - if len(links)==0 and len(inputs)==1: - eventTree = ET.parse(inputs[0].getPath() + inputs[0].getFilename()) - return self.analyzeSingleET(eventTree.getroot()) - - def createLinkList(self,listRoots): - """ - This method identifies the links among ETs. It saves such links in the variable links. - Note that this method overwrites such variable when it is called. This is because the identification - of the links needs to be computed from scratch after every merging step since the structure of ETs has changed. 
- The variable links=[dep1,...,depN] is a list of connections where each connection dep is a - dictionary as follows: - - dep.keys = - * link_seqID : ID of the link in the master ET - * ET_slave_ID : ID of the slave ET that needs to be copied into the master ET - * ET_master_ID: ID of the master ET; - - The slave ET is merged into the master ET; note the master ET contains the link in at least one - node: - - - - - - @ In, listRoots, list, list containing the root of all ETs - @ Out, linkList, list, list containing the links among the ETs - """ - linkList = [] - for root in listRoots: - links, seqID = self.checkLinkedTree(root) - if len(links) > 0: - for idx, val in enumerate(links): - dep = {} - dep['link_seqID'] = copy.deepcopy(seqID[idx]) - dep['ET_slave_ID'] = copy.deepcopy(val) - dep['ET_master_ID'] = copy.deepcopy(root.get('name')) - linkList.append(dep) - return linkList - - def checkETstructure(self,links,listETs,connectivityMatrix): - """ - This method checks that the structure of the ET is consistent. In particular, it checks that only one root ET - and at least one leaf ET is provided. As an example consider the following ET structure: - ET1 ----> ET2 ----> ET3 - |------> ET4 ----> ET5 - Five ETs have been provided, ET1 is the only root ET while ET3 and ET5 are leaf ET. - @ In, listETs, list, list containing the ID of the ETs - @ In, connectivityMatrix, np.array, matrix containing connectivity mapping - @ Out, rootETID, xml.etree.Element, root of the main ET - """ - - # each element (i,j) of the matrix connectivityMatrix shows if there is a connection from ET_i to ET_j: - # * 0: no connection from i to j - # * 1: a connection exists from i to j - for link in links: - row = listETs.index(link['ET_master_ID']) - col = listETs.index(link['ET_slave_ID']) - connectivityMatrix[row,col]=1.0 - - # the root ETs are charaterized by a column full of zeros - # the leaf ETs are charaterized by a row full of zeros - zeroRows = np.where(~connectivityMatrix.any(axis=1))[0] - zeroColumns = np.where(~connectivityMatrix.any(axis=0))[0] - - if len(zeroColumns)>1: - self.raiseAnError(IOError, 'Multiple root ET') - if len(zeroColumns)==0: - self.raiseAnError(IOError, 'No root ET') - if len(zeroColumns)==1: - rootETID = listETs[zeroColumns[0]] - self.raiseADebug("ETImporter Root ET: " + str(rootETID)) - - leafs = [] - for index in np.nditer(zeroRows): - leafs.append(listETs[index]) - self.raiseADebug("ETImporter leaf ETs: " + str(leafs)) - - return rootETID - - def analyzeMultipleET(self,inputs,links,listRoots,listETs,rootETID): - """ - This method executes the analysis of the ET if multiple ETs are provided. It merge all ETs onto the root ET - @ In, input, list, list of file objects - @ In, links, list, list containing the links among the ETs - @ In, listRoots, list containing the root of all ETs - @ In, listETs, list, list containing the ID of the ETs - @ In, rootETID, xml.etree.Element, root of the main ET - @ Out, xmlNode, xml.etree.Element, root of the assembled root ET - """ - # 1. for all ET check if it contains SubBranches - ETset = [] - for fileID in inputs: - eventTree = ET.parse(fileID.getPath() + fileID.getFilename()) - root = self.checkSubBranches(eventTree.getroot()) - ETset.append(root) - - # 2. 
loop on the dependencies until it is empty - while len(links)>0: - for link in links: - indexMaster = listETs.index(link['ET_master_ID']) - indexSlave = listETs.index(link['ET_slave_ID']) - mergedTree = self.mergeLinkedTrees(listRoots[indexMaster],listRoots[indexSlave],link['link_seqID']) - - listETs.pop(indexMaster) - listRoots.pop(indexMaster) - - listETs.append(link['ET_master_ID']) - listRoots.append(mergedTree) - - links = self.createLinkList(listRoots) - - indexRootET = listETs.index(rootETID) - return listRoots[indexRootET] - - def analyzeSingleET(self,masterRoot): - """ - This method executes the analysis of the ET if a single ET is provided. - @ In, masterRoot, xml.etree.Element, root of the ET - @ Out, outputDict, dict, dictionary containing the pointSet data - """ - root = self.checkSubBranches(masterRoot) - - ## These outcomes will be encoded as integers starting at 0 - outcomes = [] - ## These variables will be mapped into an array where there index - variables = [] - values = {} - for node in root.findall('define-functional-event'): - event = node.get('name') - ## First, map the variable to an index by placing it in a list - variables.append(event) - ## Also, initialize the dictionary of values for this variable so we can - ## encode them as integers as well - values[event] = [] - ## Iterate through the forks that use this event and gather all of the - ## possible states - for fork in self.findAllRecursive(root.find('initial-state'), 'fork'): - if fork.get('functional-event') == event: - for path in fork.findall('path'): - state = path.get('state') - if state not in values[event]: - values[event].append(state) - - ## Iterate through the sequences and gather all of the possible outcomes - ## so we can numerically encode them latter - for node in root.findall('define-sequence'): - outcome = node.get('name') - if outcome not in outcomes: - outcomes.append(outcome) - etMap = self.returnMap(outcomes, root.get('name')) - - self.raiseADebug("ETImporter variables identified: " + str(format(variables))) - - d = len(variables) - n = len(self.findAllRecursive(root.find('initial-state'), 'sequence')) - pointSet = -1 * np.ones((n, d + 1)) - rowCounter = 0 - for node in root.find('initial-state'): - newRows = self.constructPointDFS(node, variables, values, etMap, pointSet, rowCounter) - rowCounter += newRows - outputDict = {} - #outputDict['inputs'] = {} - #outputDict['outputs'] = {} - for index, var in enumerate(variables): - outputDict[var] = pointSet[:, index] - outputDict['sequence'] = pointSet[:, -1] - - return outputDict, variables - - def checkLinkedTree(self, root): - """ - This method checks if the provided root of the ET contains links to other ETs. - This occurs if a node contains a sub-node: - - - - The name of the is the link to the ET defined as follows: - - - @ In, root, xml.etree.Element, root of the root ET - @ Out, dependencies, list, ID of the linked ET (e.g., Link-to-LP-Event-Tree) - @ Out, seqID, list, ID of the link in the root ET (e.g., Link-to-LP) - """ - dependencies = [] - seqID = [] - - for node in root.findall('define-sequence'): - for child in node.getiterator(): - if 'event-tree' in child.tag: - dependencies.append(child.get('name')) - seqID.append(node.get('name')) - return dependencies, seqID - - def mergeLinkedTrees(self,rootMaster,rootSlave,location): - """ - This method merged two ET; it merges a slave ET onto the master ET. Note that slave ET can be copied - in multiple branches of the master ET. 
- @ In, rootMaster, xml.etree.Element, root of the master ET - @ In, rootSlave, xml.etree.Element, root of the slave ET - @ In, location, string, ID of the link that identifies the branches of the master ET that are linked to the slave ET - @ Out, rootMaster, xml.etree.Element, root of the master ET after the merging process has completed - """ - # 1. copy define-functional-event block - for node in rootSlave.findall('define-functional-event'): - rootMaster.append(node) - # 2. copy define-sequence block - for node in rootSlave.findall('define-sequence'): - rootMaster.append(node) - # 3. remove the that points at the "location" - for node in rootMaster.findall('define-sequence'): - if node.get('name') == location: - rootMaster.remove(node) - # 4. copy slave ET into master ET - for node in rootMaster.findall('.//'): - if node.tag == 'path': - for subNode in node.findall('sequence'): - linkName = subNode.get('name') - if linkName == location: - node.append(rootSlave.find('initial-state').find('fork')) - node.remove(subNode) - return copy.deepcopy(rootMaster) - - def checkSubBranches(self,root): - """ - This method checks if the provided ET contains sub-branches (i.e., by-pass). - This occurs if the node is provided. - As an example: - - defines a branch that is linked to multiple ET sequences: - - - - The scope of this method is to copy the into the sequences of the - ET that are linked to - @ In, root, xml.etree.Element, root of the ET - @ Out, root, xml.etree.Element, root of the processed ET - """ - eventTree = root.findall('initial-state') - - if len(eventTree) > 1: - self.raiseAnError(IOError, 'ETImporter: more than one initial-state identified') - ### Check for sub-branches - subBranches = {} - for node in root.findall('define-branch'): - subBranches[node.get('name')] = node.find('fork') - self.raiseADebug("ETImporter branch identified: " + str(node.get('name'))) - if len(subBranches) > 0: - for node in root.findall('.//'): - if node.tag == 'path': - for subNode in node.findall('branch'): - linkName = subNode.get('name') - if linkName in subBranches.keys(): - node.append(subBranches[linkName]) - else: - self.raiseAnError(RuntimeError, ' ETImporter: branch ' + str( - linkName) + ' linked in the ET is not defined; available branches are: ' + str( - subBranches.keys())) - - for child in root: - if child.tag == 'branch': - root.remove(child) - - return root - - def returnMap(self,outcomes,name): - """ - This method returns a map if the ET contains symbolic sequences. - This is needed since since RAVEN requires numeric values for sequences. 
- @ In, outcomes, list, list that contains all the sequences IDs provided in the ET - @ In, name, string, name of the ET - @ Out, etMap, dict, dictionary containing the map - """ - # check if outputMap contains string ID for at least one sequence - # if outputMap contains all numbers then keep the number ID - allFloat = True - for seq in outcomes: - try: - float(seq) - except ValueError: - allFloat = False - break - etMap = {} - if allFloat == False: - # create an integer map, and create an integer map file - root = ET.Element('map') - root.set('Tree', name) - for seq in outcomes: - etMap[seq] = outcomes.index(seq) - ET.SubElement(root, "sequence", ID=str(outcomes.index(seq))).text = str(seq) - fileID = name + '_mapping.xml' - updatedTreeMap = ET.ElementTree(root) - xmlU.prettify(updatedTreeMap) - updatedTreeMap.write(fileID) - else: - for seq in outcomes: - etMap[seq] = utils.floatConversion(seq) - return etMap + eventTreeModel = ETStructure(self.expand, inputs) + return eventTreeModel.returnDict() def collectOutput(self, finishedJob, output): """ Function to place all of the computed data into the output object, (DataObjects) - @ In, finishedJob, object, JobHandler object that is in charge of running this postprocessor + @ In, finishedJob, object, JobHandler object that is in charge of running this PostProcessor @ In, output, object, the object where we want to place our computed results @ Out, None """ evaluation = finishedJob.getEvaluation() - outputDict, variables = evaluation[1] + outputDict ={} + outputDict['data'], variables = evaluation[1] if isinstance(evaluation, Runners.Error): self.raiseAnError(RuntimeError, ' No available output to collect (Run probably is not finished yet) via',self.printTag) if not set(output.getVars('input')) == set(variables): self.raiseAnError(RuntimeError, ' ETImporter: set of branching variables in the ' - 'ET ( ' + str(output.getParaKeys('inputs')) + ' ) is not identical to the' - ' set of input variables specified in the PointSet (' + str(variables) +')') + 'ET ( ' + str(variables) + ' ) is not identical to the' + ' set of input variables specified in the PointSet (' + str(output.getParaKeys('inputs')) +')') # Output to file - if set(outputDict.keys()) != set(output.getVars()): + if set(outputDict['data'].keys()) != set(output.getVars(subset='input')+output.getVars(subset='output')): self.raiseAnError(RuntimeError, 'ETImporter failed: set of variables specified in the output ' - 'dataObject (' + str(set(output.getVars())) + ') is different form the set of ' - 'variables specified in the ET (' + str(set(outputDict.keys()))) + 'dataObject (' + str(set(outputDict['data'].keys())) + ') is different from the set of ' + 'variables specified in the ET (' + str(set(output.getVars(subset='input')+output.getVars(subset='output')))) if output.type in ['PointSet']: - output.load(outputDict,style='dict') + outputDict['dims'] = {} + for key in outputDict.keys(): + outputDict['dims'][key] = [] + output.load(outputDict['data'], style='dict', dims=outputDict['dims']) else: self.raiseAnError(RuntimeError, 'ETImporter failed: Output type ' + str(output.type) + ' is not supported.') - - def findAllRecursive(self, node, element): - """ - A function for recursively traversing a node in an elementTree to find - all instances of a tag. - Note that this method differs from findall() since it goes for all nodes, - subnodes, subsubnodes etc. 
recursively - @ In, node, ET.Element, the current node to search under - @ In, element, str, the string name of the tags to locate - @ InOut, result, list, a list of the currently recovered results - """ - result=[] - for elem in node.iter(tag=element): - result.append(elem) - return result - - def constructPointDFS(self, node, inputMap, stateMap, outputMap, X, rowCounter): - """ - Construct a "sequence" using a depth-first search on a node, each call - will be on a fork except in the base case which will be called on a - sequence node. The recursive case will traverse into a path node, thus - path nodes will be "skipped" in the call stack as one level of paths - will be processed per recursive call in order to set one of the columns - of X for the row identified by rowCounter. - @ In, node, ET.Element, the current node to process - @ In, inputMap, list, a map for converting string variable names into - sequential non-negative integers that can be used to index X - @ In, stateMap, dict, a map similar to above, but instead converts the - possible states for each event (variable) into non-negative - integers - @ In, outputMap, list, a map for converting string outcome values into - non-negative integers - @ In, X, np.array, data object to populate with values - @ In, rowCounter, int, the row we are currently editing in X - @ Out, offset, int, the number of rows of X this call has populated - """ - - # Construct point - if node.tag == 'sequence': - col = X.shape[1]-1 - outcome = node.get('name') - val = outputMap[outcome] - X[rowCounter, col] = val - rowCounter += 1 - elif node.tag == 'fork': - event = node.get('functional-event') - col = inputMap.index(event) - - for path in node.findall('path'): - state = path.get('state') - if state == 'failure': - val = '+1' - elif state =='success': - val = '0' - else: - val = stateMap[event].index(state) - ## Fill in the rest of the data as the recursive nature will only - ## fill in the details under this branch, later iterations will - ## correct lower rows if a path does change - X[rowCounter, col] = val - for fork in path.getchildren(): - newCounter = self.constructPointDFS(fork, inputMap, stateMap, outputMap, X, rowCounter) - for i in range(newCounter-rowCounter): - X[rowCounter+i, col] = val - rowCounter = newCounter - - return rowCounter diff --git a/framework/PostProcessors/ETStructure.py b/framework/PostProcessors/ETStructure.py new file mode 100644 index 0000000000..7f6acd0a3f --- /dev/null +++ b/framework/PostProcessors/ETStructure.py @@ -0,0 +1,473 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
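+
+# Typical usage of this class (illustrative sketch; the variable names below are hypothetical
+# and must match the functional events defined in the imported event tree):
+#
+#   structure = ETStructure(expand=False, inputs=listOfETFileObjects)
+#   pointSetDict, branchingVariables = structure.returnDict()
+#   sequenceID = structure.solve({'ACC': 0., 'LPI': 1., 'LPR': 0.})
+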
+""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#Internal Modules--------------------------------------------------------------- +import MessageHandler +from utils import utils +from utils import xmlUtils as xmlU +import MessageHandler +#Internal Modules End----------------------------------------------------------- + +#External Modules--------------------------------------------------------------- +import numpy as np +import xml.etree.ElementTree as ET +import copy +import itertools +from collections import OrderedDict +#External Modules End----------------------------------------------------------- + +class ETStructure(object): + """ + This is the base class of the ET structure which actually handles ET structures which is used by the ETimporter and the ETmodel + """ + def __init__(self, expand, inputs): + """ + This method executes the post-processor action. + @ In, inputs, list, list of file objects + @ In, expand, bool, boolean variable which indicates if the ET needs to be factorially expanded + @ Out, None + """ + self.expand = expand + ### Check for link to other ET + links = [] + sizes=(len(inputs),len(inputs)) + connectivityMatrix = np.zeros(sizes) + listETs=[] + listRoots=[] + + for fileID in inputs: + eventTree = ET.parse(fileID.getPath() + fileID.getFilename()) + listETs.append(eventTree.getroot().get('name')) + listRoots.append(eventTree.getroot()) + links = self.createLinkList(listRoots) + + if len(inputs)>0: + rootETID = self.checkETStructure(links,listETs,connectivityMatrix) + + if len(links)>=1 and len(inputs)>1: + finalAssembledTree = self.analyzeMultipleET(inputs,links,listRoots,listETs,rootETID) + self.pointSet = self.analyzeSingleET(finalAssembledTree) + + elif len(links)==0 and len(inputs)>1: + raise IOError('Multiple ET files have provided but they are not linked') + + elif len(links)>1 and len(inputs)==1: + raise IOError('A single ET files has provided but it contains a link to an additional ET') + + elif len(links)==0 and len(inputs)==1: + eventTree = ET.parse(inputs[0].getPath() + inputs[0].getFilename()) + self.pointSet = self.analyzeSingleET(eventTree.getroot()) + + def solve(self,combination): + """ + This method provides the sequence of the ET given the status of its branching conditions + @ In, combination, dict, values of all ET branching conditions + @ In, outcome, float, sequence of the ET corresponding to the provided ET branching conditions + """ + combinationArray=np.zeros(len(self.variables)) + outcome = -1 + for index, var in enumerate(self.variables): + combinationArray[index] = combination[var] + inputData = self.pointSet[:,:len(self.variables)] + for row in self.pointSet: + if np.array_equal(row[:len(self.variables)],combinationArray): + outcome = row[-1] + return outcome + + def returnDict(self): + """ + This method returns the ET data + @ In, None + @ Out, (outputDict,self.variables), tuple, tuple containing 1) outputDict (dict, dictionary containing the + values of all ET branching conditions) and 2) self.variables + (list, IDs of the ET branching conditions) + """ + outputDict = {} + for index, var in enumerate(self.variables): + outputDict[var] = self.pointSet[:, index] + outputDict['sequence'] = self.pointSet[:, -1] + + return outputDict, self.variables + + def createLinkList(self,listRoots): + """ + This method identifies the links among ETs. 
It saves such links in the variable links. + Note that this method overwrites such variable when it is called. This is because the identification + of the links needs to be computed from scratch after every merging step since the structure of ETs has changed. + The variable links=[dep1,...,depN] is a list of connections where each connection dep is a + dictionary as follows: + + dep.keys = + * link_seqID : ID of the link in the master ET + * ET_slave_ID : ID of the slave ET that needs to be copied into the master ET + * ET_master_ID: ID of the master ET; + + The slave ET is merged into the master ET; note the master ET contains the link in at least one + node: + + + + + + @ In, listRoots, list, list containing the root of all ETs + @ Out, linkList, list, list containing the links among the ETs + """ + linkList = [] + for root in listRoots: + links, seqID = self.checkLinkedTree(root) + if len(links) > 0: + for idx, val in enumerate(links): + dep = {} + dep['link_seqID'] = copy.deepcopy(seqID[idx]) + dep['ET_slave_ID'] = copy.deepcopy(val) + dep['ET_master_ID'] = copy.deepcopy(root.get('name')) + linkList.append(dep) + return linkList + + def checkETStructure(self,links,listETs,connectivityMatrix): + """ + This method checks that the structure of the ET is consistent. In particular, it checks that only one root ET + and at least one leaf ET is provided. As an example consider the following ET structure: + ET1 ----> ET2 ----> ET3 + |------> ET4 ----> ET5 + Five ETs have been provided, ET1 is the only root ET while ET3 and ET5 are leaf ET. + @ In, links, list, list containing all the link connectivities among ETs + @ In, listETs, list, list containing the ID of the ETs + @ In, connectivityMatrix, np.array, matrix containing connectivity mapping + @ Out, rootETID, xml.etree.Element, root of the main ET + """ + + # each element (i,j) of the matrix connectivityMatrix shows if there is a connection from ET_i to ET_j: + # * 0: no connection from i to j + # * 1: a connection exists from i to j + for link in links: + row = listETs.index(link['ET_master_ID']) + col = listETs.index(link['ET_slave_ID']) + connectivityMatrix[row,col]=1.0 + + # the root ETs are charaterized by a column full of zeros + # the leaf ETs are charaterized by a row full of zeros + zeroRows = np.where(~connectivityMatrix.any(axis=1))[0] + zeroColumns = np.where(~connectivityMatrix.any(axis=0))[0] + + if len(zeroColumns)>1: + raise IOError('Multiple root ET') + if len(zeroColumns)==0: + raise IOError('No root ET') + if len(zeroColumns)==1: + rootETID = listETs[zeroColumns.astype(int)[0]] + print("ETImporter Root ET: " + str(rootETID)) + + leafs = [] + for index in np.nditer(zeroRows): + leafs.append(listETs[index]) + print("ETImporter leaf ETs: " + str(leafs)) + + return rootETID + + def analyzeMultipleET(self,inputs,links,listRoots,listETs,rootETID): + """ + This method executes the analysis of the ET if multiple ETs are provided. It merge all ETs onto the root ET + @ In, input, list, list of file objects + @ In, links, list, list containing the links among the ETs + @ In, listRoots, list containing the root of all ETs + @ In, listETs, list, list containing the ID of the ETs + @ In, rootETID, xml.etree.Element, root of the main ET + @ Out, xmlNode, xml.etree.Element, root of the assembled root ET + """ + # 1. 
for all ET check if it contains SubBranches + ETset = [] + for fileID in inputs: + eventTree = ET.parse(fileID.getPath() + fileID.getFilename()) + root = self.checkSubBranches(eventTree.getroot()) + ETset.append(root) + + # 2. loop on the dependencies until it is empty + while len(links)>0: + for link in links: + indexMaster = listETs.index(link['ET_master_ID']) + indexSlave = listETs.index(link['ET_slave_ID']) + mergedTree = self.mergeLinkedTrees(listRoots[indexMaster],listRoots[indexSlave],link['link_seqID']) + + listETs.pop(indexMaster) + listRoots.pop(indexMaster) + + listETs.append(link['ET_master_ID']) + listRoots.append(mergedTree) + + links = self.createLinkList(listRoots) + + indexRootET = listETs.index(rootETID) + return listRoots[indexRootET] + + def analyzeSingleET(self,masterRoot): + """ + This method executes the analysis of the ET if a single ET is provided. + @ In, masterRoot, xml.etree.Element, root of the ET + @ Out, pointSet, np.array, numpy matrix containing the pointSet data + """ + root = self.checkSubBranches(masterRoot) + + ## These outcomes will be encoded as integers starting at 0 + outcomes = [] + ## These variables will be mapped into an array where there index + self.variables = [] + values = {} + for node in root.findall('define-functional-event'): + event = node.get('name') + ## First, map the variable to an index by placing it in a list + self.variables.append(event) + ## Also, initialize the dictionary of values for this variable so we can + ## encode them as integers as well + values[event] = [] + ## Iterate through the forks that use this event and gather all of the + ## possible states + for fork in xmlU.findAllRecursive(root.find('initial-state'), 'fork'): + if fork.get('functional-event') == event: + for path in fork.findall('path'): + state = path.get('state') + if state not in values[event]: + values[event].append(state) + + ## Iterate through the sequences and gather all of the possible outcomes + ## so we can numerically encode them latter + for node in root.findall('define-sequence'): + outcome = node.get('name') + if outcome not in outcomes: + outcomes.append(outcome) + etMap = self.returnMap(outcomes, root.get('name')) + print("ETImporter variables identified: " + str(format(self.variables))) + + d = len(self.variables) + n = len(xmlU.findAllRecursive(root.find('initial-state'), 'sequence')) + pointSet = -1 * np.ones((n, d + 1)) + rowCounter = 0 + for node in root.find('initial-state'): + newRows = self.constructPointDFS(node, self.variables, values, etMap, pointSet, rowCounter) + rowCounter += newRows + + if self.expand: + pointSet = self.expandPointSet(pointSet,values) + + return pointSet + + def expandPointSet(self,pointSet,values): + """ + This method performs a full-factorial expansion of the ET: if a branch contains a -1 element this method + duplicate the branch; each duplicated branch contains element values equal to +1 and 0. 
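+      For example, if the first column corresponds to a branching condition with states 0 and +1,
+      the row [-1., 3.] is expanded into the two rows [0., 3.] and [1., 3.].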
+      @ In, pointSet, np.array, original point set
+      @ In, values, dict, dictionary containing the numerical value associated to each functional event
+      @ Out, pointSet, np.array, expanded point set
+    """
+    for col in range(pointSet.shape[1]):
+      indexes = np.where(pointSet[:,col] == -1)[0]
+      if indexes.size>0:
+        for idx in indexes:
+          var = self.variables[col]
+          pointSet[idx,col] = values[var][0]
+          for index, value in enumerate(values[var]):
+            if index > 0:
+              rowToBeAdded = copy.deepcopy(pointSet[idx,:])
+              rowToBeAdded[col] = value
+              pointSet = np.vstack([pointSet,rowToBeAdded])
+    return pointSet
+
+  def checkLinkedTree(self, root):
+    """
+      This method checks if the provided root of the ET contains links to other ETs.
+      This occurs if a <define-sequence> node contains an <event-tree> sub-node: the name of the
+      <event-tree> sub-node (e.g., Link-to-LP-Event-Tree) is the name of the linked ET, while the
+      name of the <define-sequence> node (e.g., Link-to-LP) identifies the link in the root ET.
+      @ In, root, xml.etree.Element, root of the root ET
+      @ Out, returnData, tuple, tuple containing 1) dependencies (list of IDs of the linked ETs (e.g., Link-to-LP-Event-Tree))
+                         and 2) seqID (list of IDs of the links in the root ET (e.g., Link-to-LP))
+    """
+    dependencies = []
+    seqID = []
+
+    for node in root.findall('define-sequence'):
+      for child in node.getiterator():
+        if 'event-tree' in child.tag:
+          dependencies.append(child.get('name'))
+          seqID.append(node.get('name'))
+    returnData = dependencies, seqID
+    return returnData
+
+  def mergeLinkedTrees(self,rootMaster,rootSlave,location):
+    """
+      This method merges two ETs: it merges a slave ET onto the master ET. Note that the slave ET can be copied
+      into multiple branches of the master ET.
+      @ In, rootMaster, xml.etree.Element, root of the master ET
+      @ In, rootSlave, xml.etree.Element, root of the slave ET
+      @ In, location, string, ID of the link that identifies the branches of the master ET that are linked to the slave ET
+      @ Out, rootMaster, xml.etree.Element, root of the master ET after the merging process has completed
+    """
+    # 1. copy define-functional-event block
+    for node in rootSlave.findall('define-functional-event'):
+      rootMaster.append(node)
+    # 2. copy define-sequence block
+    for node in rootSlave.findall('define-sequence'):
+      rootMaster.append(node)
+    # 3. remove the <define-sequence> that points at the "location"
+    for node in rootMaster.findall('define-sequence'):
+      if node.get('name') == location:
+        rootMaster.remove(node)
+    # 4. copy slave ET into master ET
+    for node in rootMaster.findall('.//'):
+      if node.tag == 'path':
+        for subNode in node.findall('sequence'):
+          linkName = subNode.get('name')
+          if linkName == location:
+            node.append(rootSlave.find('initial-state').find('fork'))
+            node.remove(subNode)
+    return copy.deepcopy(rootMaster)
+
+  def checkSubBranches(self,root):
+    """
+      This method checks if the provided ET contains sub-branches (i.e., by-passes).
+      This occurs if a <define-branch> node is provided: such a node defines a branch (a <fork> sub-tree)
+      that can be linked from multiple sequences of the ET through <branch> nodes located under <path> nodes.
+      The scope of this method is to copy the content of each <define-branch> into the sequences of the
+      ET that are linked to it.
+      @ In, root, xml.etree.Element, root of the ET
+      @ Out, root, xml.etree.Element, root of the processed ET
+    """
+    eventTree = root.findall('initial-state')
+
+    if len(eventTree) > 1:
+      print('ETImporter: more than one initial-state identified')
+    ### Check for sub-branches
+    subBranches = {}
+    for node in root.findall('define-branch'):
+      subBranches[node.get('name')] = node.find('fork')
+      print("ETImporter branch identified: " + str(node.get('name')))
+    if len(subBranches) > 0:
+      for node in root.findall('.//'):
+        if node.tag == 'path':
+          for subNode in node.findall('branch'):
+            linkName = subNode.get('name')
+            if linkName in subBranches.keys():
+              node.append(subBranches[linkName])
+            else:
+              raise IOError(' ETImporter: branch ' + str(linkName) + ' linked in the ET is not defined; available branches are: ' + str(subBranches.keys()))
+
+    for child in root:
+      if child.tag == 'branch':
+        root.remove(child)
+
+    return root
+
+  def returnMap(self,outcomes,name):
+    """
+      This method returns a map if the ET contains symbolic sequences.
+      This is needed since RAVEN requires numeric values for sequences.
+      @ In, outcomes, list, list that contains all the sequence IDs provided in the ET
+      @ In, name, string, name of the ET
+      @ Out, etMap, dict, dictionary containing the map
+    """
+    # check if outcomes contains a string ID for at least one sequence;
+    # if outcomes contains only numbers then keep the numeric IDs
+    allFloat = True
+    for seq in outcomes:
+      try:
+        float(seq)
+      except ValueError:
+        allFloat = False
+        break
+    etMap = {}
+    if allFloat == False:
+      # create an integer map, and create an integer map file
+      root = ET.Element('map')
+      root.set('Tree', name)
+      for seq in outcomes:
+        etMap[seq] = outcomes.index(seq)
+        ET.SubElement(root, "sequence", ID=str(outcomes.index(seq))).text = str(seq)
+      fileID = name + '_mapping.xml'
+      updatedTreeMap = ET.ElementTree(root)
+      xmlU.prettify(updatedTreeMap)
+      updatedTreeMap.write(fileID)
+    else:
+      for seq in outcomes:
+        etMap[seq] = utils.floatConversion(seq)
+    return etMap
+
+  def constructPointDFS(self, node, inputMap, stateMap, outputMap, X, rowCounter):
+    """
+      Construct a "sequence" using a depth-first search on a node; each call
+      will be on a fork, except in the base case, which will be called on a
+      sequence node. The recursive case will traverse into a path node, thus
+      path nodes will be "skipped" in the call stack, as one level of paths
+      will be processed per recursive call in order to set one of the columns
+      of X for the row identified by rowCounter.
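+      As an example, when the current node is a fork on a functional event, the state value of each of
+      its paths is written into that event's column for every row that is filled by the recursive calls
+      made under that path, before moving on to the next path.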
+ @ In, node, ET.Element, the current node to process + @ In, inputMap, list, a map for converting string variable names into + sequential non-negative integers that can be used to index X + @ In, stateMap, dict, a map similar to above, but instead converts the + possible states for each event (variable) into non-negative + integers + @ In, outputMap, list, a map for converting string outcome values into + non-negative integers + @ In, X, np.array, data object to populate with values + @ In, rowCounter, int, the row we are currently editing in X + @ Out, rowCounter, int, the number of rows of X this call has populated + """ + # Construct point + if node.tag == 'sequence': + col = X.shape[1]-1 + outcome = node.get('name') + val = outputMap[outcome] + X[rowCounter, col] = val + rowCounter += 1 + elif node.tag == 'fork': + event = node.get('functional-event') + col = inputMap.index(event) + + for path in node.findall('path'): + state = path.get('state') + if state == 'failure': + val = '+1' + elif state =='success': + val = '0' + else: + val = int(state)#stateMap[event].index(state) + ## Fill in the rest of the data as the recursive nature will only + ## fill in the details under this branch, later iterations will + ## correct lower rows if a path does change + X[rowCounter, col] = val + for fork in path.getchildren(): + newCounter = self.constructPointDFS(fork, inputMap, stateMap, outputMap, X, rowCounter) + for i in range(newCounter-rowCounter): + X[rowCounter+i, col] = val + rowCounter = newCounter + return rowCounter diff --git a/framework/PostProcessors/FTGate.py b/framework/PostProcessors/FTGate.py new file mode 100644 index 0000000000..4c5bfb952d --- /dev/null +++ b/framework/PostProcessors/FTGate.py @@ -0,0 +1,279 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#Internal Modules--------------------------------------------------------------- +import MessageHandler +from utils import utils +from utils import xmlUtils as xmlU +#Internal Modules End----------------------------------------------------------- + +#External Modules--------------------------------------------------------------- +import numpy as np +import xml.etree.ElementTree as ET +import copy +import itertools +from collections import OrderedDict +#External Modules End----------------------------------------------------------- + +class FTGate(object): + """ + This is the base class for all possible Boolean logic gates to be employed in Fault-Trees + """ + def __init__(self, xmlNode): + """ + Method that initializes the gate + @ In, xmlNode, xml.etree.Element, Xml element node + @ Out, None + """ + self.name = None # Specific ID of the gate + self.gate = None # Type of logic gate (see self.allowedGates for a loist of allowed gate types) + self.arguments = [] # input elements of the gate + self.negations = [] # input elements of the gate that are negated + self.params = {} # specific paramteres of the gate + self.allowedGates = {'not':1,'and':'inf','or':'inf','xor':'inf','iff':2,'nand':'inf','nor':'inf','atleast':'inf','cardinality':'inf','imply':2} + + self.name = xmlNode.get('name') + + for child in xmlNode: + if child.attrib: + self.params = child.attrib + if child.tag in self.allowedGates.keys(): + self.gate = child.tag + else: + raise IOError('FTImporterPostProcessor Post-Processor; gate ' + str(child.tag) + ' : is not recognized. 
Allowed gates are: '+ str(self.allowedGates.keys())) + + for node in xmlU.findAllRecursive(xmlNode, 'gate'): + self.arguments.append(node.get('name')) + + for node in xmlU.findAllRecursive(xmlNode, 'basic-event'): + self.arguments.append(node.get('name')) + + for node in xmlU.findAllRecursive(xmlNode, 'house-event'): + self.arguments.append(node.get('name')) + + for child in xmlNode: + for childChild in child: + if childChild.tag == 'not': + event = list(childChild.iter()) + if len(event)>2: + raise IOError('FTImporterPostProcessor Post-Processor; gate ' + str(self.name) + ' contains a negations of multiple basic events') + elif event[1].tag in ['gate','basic-event','house-event']: + self.negations.append(event[1].get('name')) + + if self.gate in ['iff'] and len(self.arguments)>self.allowedGates['iff']: + raise IOError('FTImporterPostProcessor Post-Processor; iff gate ' + str(self.name) + ' has more than 2 events') + if self.gate in ['imply'] and len(self.arguments)>self.allowedGates['imply']: + raise IOError('FTImporterPostProcessor Post-Processor; imply gate ' + str(self.name) + ' has more than 2 events') + + def returnArguments(self): + """ + Method that returns the arguments of the gate + @ In, None + @ Out, self.arguments, list, list that contains the arguments of the gate + """ + return self.arguments + + def evaluate(self,argValues): + """ + Method that evaluates the gate + @ In, argValues, dict, dictionary containing all available variables + @ Out, outcome, float, calculated outcome of the gate + """ + argumentValues = copy.deepcopy(argValues) + for key in self.negations: + if argumentValues[key]==1: + argumentValues[key]=0 + else: + argumentValues[key]=1 + if set(self.arguments) <= set(argumentValues.keys()): + argumentsToPass = OrderedDict() + for arg in self.arguments: + argumentsToPass[arg] = argumentValues[arg] + outcome = self.evaluateGate(argumentsToPass) + return outcome + else: + raise IOError('FTImporterPostProcessor Post-Processor; gate ' + str(self.name) + ' can receive these arguments ' + str(self.arguments) + ' but the following were passed ' + str(argumentValues.keys()) ) + + def evaluateGate(self,argumentValues): + """ + Method that evaluates the gate. 
+ Note that argumentValues is passed directly (instead of argumentValues.values()) in case of the imply gate since the events IDs are important + @ In, argumentValues, OrderedDict, dictionary containing only the variables of interest to the gate + @ Out, outcome, float, calculated outcome of the gate + """ + if self.gate == 'and': + outcome = andGate(argumentValues.values()) + elif self.gate == 'or': + outcome = orGate(argumentValues.values()) + elif self.gate == 'nor': + outcome = norGate(argumentValues.values()) + elif self.gate == 'nand': + outcome = nandGate(argumentValues.values()) + elif self.gate == 'xor': + outcome = xorGate(argumentValues.values()) + elif self.gate == 'iff': + outcome = iffGate(argumentValues.values()) + elif self.gate == 'atleast': + outcome = atLeastGate(argumentValues.values(),float(self.params['min'])) + elif self.gate == 'cardinality': + outcome = cardinalityGate(argumentValues.values(),float(self.params['min']),float(self.params['max'])) + elif self.gate == 'imply': + outcome = implyGate(argumentValues) + elif self.gate == 'not': + outcome = notGate(argumentValues.values()) + return outcome + +def notGate(value): + """ + Method that evaluates the NOT gate + @ In, value, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + if len(value)>1: + raise IOError('NOT gate has received in input ' + str(len(value)) + ' values instead of 1.') + if value[0]==0: + outcome = 1 + else: + outcome = 0 + return outcome + +def andGate(argumentValues): + """ + Method that evaluates the AND gate + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + if 0 in argumentValues: + outcome = 0 + else: + outcome = 1 + return outcome + +def orGate(argumentValues): + """ + Method that evaluates the OR gate + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + if 1 in argumentValues: + outcome = 1 + else: + outcome = 0 + return outcome + +def nandGate(argumentValues): + """ + Method that evaluates the NAND gate + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + out = [] + out.append(andGate(argumentValues)) + outcome = notGate(out) + return outcome + +def norGate(argumentValues): + """ + Method that evaluates the NOR gate + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + out = [] + out.append(orGate(argumentValues)) + outcome = notGate(out) + return outcome + +def xorGate(argumentValues): + """ + Method that evaluates the XOR gate + The XOR gate gives a true (1 or HIGH) output when the number of true inputs is odd. + https://electronics.stackexchange.com/questions/93713/how-is-an-xor-with-more-than-2-inputs-supposed-to-work + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + if argumentValues.count(1.) % 2 != 0: + outcome = 1 + else: + outcome = 0 + return outcome + +def iffGate(argumentValues): + """ + Method that evaluates the IFF gate + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + out = [] + out.append(xorGate(argumentValues)) + outcome = notGate(out) + return outcome + +def implyGate(argumentValues): + """ + Method that evaluates the IMPLY gate + Note that this gate requires a specific definition of the two inputs. 
This definition is specifed in the order of the events provided in the input file + As an example, BE1->BE2 is translated as: + + + + + + + @ In, argumentValues, list, list of values + @ Out, outcome, float, calculated outcome of the gate + """ + keys = argumentValues.keys() + if argumentValues[keys[0]]==1 and argumentValues[keys[1]]==0: + outcome = 0 + else: + outcome = 1 + return outcome + +def atLeastGate(argumentValues,k): + """ + Method that evaluates the ATLEAST gate + @ In, argumentValues, list, list of values + @ In, k, float, max number of allowed events + @ Out, outcome, float, calculated outcome of the gate + """ + if argumentValues.count(1) >= k: + outcome = 1 + else: + outcome = 0 + return outcome + +def cardinalityGate(argumentValues,l,h): + """ + Method that evaluates the CARDINALITY gate + @ In, argumentValues, list, list of values + @ In, l, float, min number of allowed events + @ In, h, float, max number of allowed events + @ Out, outcome, float, calculated outcome of the gate + """ + nOcc = argumentValues.count(1) + if nOcc >= l and nOcc <= h: + outcome = 1 + else: + outcome = 0 + return outcome + diff --git a/framework/PostProcessors/FTImporter.py b/framework/PostProcessors/FTImporter.py new file mode 100644 index 0000000000..1883464ea0 --- /dev/null +++ b/framework/PostProcessors/FTImporter.py @@ -0,0 +1,131 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +Created on Dec 21, 2017 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +import numpy as np +import xml.etree.ElementTree as ET +import copy +import itertools +from collections import OrderedDict +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from .PostProcessor import PostProcessor +from utils import InputData +from utils import xmlUtils as xmlU +from utils import utils +from .FTStructure import FTStructure +import Files +import Runners +#Internal Modules End----------------------------------------------------------- + +class FTImporter(PostProcessor): + """ + This is the base class of the postprocessor that imports Fault-Trees (FTs) into RAVEN as a PointSet + """ + def __init__(self, messageHandler): + """ + Constructor + @ In, messageHandler, MessageHandler, message handler object + @ Out, None + """ + PostProcessor.__init__(self, messageHandler) + self.printTag = 'POSTPROCESSOR FT IMPORTER' + self.FTFormat = None # chosen format of the FT file + self.topEventID = None + + @classmethod + def getInputSpecification(cls): + """ + Method to get a reference to a class that specifies the input data for + class cls. 
+ @ In, cls, the class for which we are retrieving the specification + @ Out, inputSpecification, InputData.ParameterInput, class to use for + specifying input of cls. + """ + inputSpecification = super(FTImporter, cls).getInputSpecification() + fileAllowedFormats = InputData.makeEnumType("FTFileFormat", "FTFileFormatType", ["OpenPSA"]) + inputSpecification.addSub(InputData.parameterInputFactory("fileFormat", contentType=fileAllowedFormats)) + inputSpecification.addSub(InputData.parameterInputFactory("topEventID", contentType=InputData.StringType)) + return inputSpecification + + def initialize(self, runInfo, inputs, initDict) : + """ + Method to initialize the pp. + @ In, runInfo, dict, dictionary of run info (e.g. working dir, etc) + @ In, inputs, list, list of inputs + @ In, initDict, dict, dictionary with initialization options + @ Out, None + """ + PostProcessor.initialize(self, runInfo, inputs, initDict) + + def _localReadMoreXML(self, xmlNode): + """ + Function to read the portion of the xml input that belongs to this specialized class + and initialize some stuff based on the inputs got + @ In, xmlNode, xml.etree.ElementTree.Element, Xml element node + @ Out, None + """ + paramInput = FTImporter.getInputSpecification()() + paramInput.parseNode(xmlNode) + self._handleInput(paramInput) + + def _handleInput(self, paramInput): + """ + Method that handles PostProcessor parameter input block. + @ In, paramInput, ParameterInput, the already parsed input. + @ Out, None + """ + fileFormat = paramInput.findFirst('fileFormat') + self.fileFormat = fileFormat.value + topEventID = paramInput.findFirst('topEventID') + self.topEventID = topEventID.value + + def run(self, inputs): + """ + This method executes the postprocessor action. + @ In, inputs, list, list of file objects + @ Out, out, dict, dict containing the processed FT + """ + faultTreeModel = FTStructure(inputs, self.topEventID) + return faultTreeModel.returnDict() + + + def collectOutput(self, finishedJob, output): + """ + Function to place all of the computed data into the output object, (DataObjects) + @ In, finishedJob, object, JobHandler object that is in charge of running this postprocessor + @ In, output, object, the object where we want to place our computed results + @ Out, None + """ + evaluation = finishedJob.getEvaluation() + outputDict ={} + outputDict['data'] = evaluation[1] + + outputDict['dims'] = {} + for key in outputDict['data'].keys(): + outputDict['dims'][key] = [] + if output.type in ['PointSet']: + output.load(outputDict['data'], style='dict', dims=outputDict['dims']) + else: + self.raiseAnError(RuntimeError, 'FTImporter failed: Output type ' + str(output.type) + ' is not supported.') diff --git a/framework/PostProcessors/FTStructure.py b/framework/PostProcessors/FTStructure.py new file mode 100644 index 0000000000..4e45a3080a --- /dev/null +++ b/framework/PostProcessors/FTStructure.py @@ -0,0 +1,148 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#Internal Modules--------------------------------------------------------------- +import MessageHandler +from utils import utils +from .FTGate import FTGate +from utils import xmlUtils as xmlU +#Internal Modules End----------------------------------------------------------- + +#External Modules--------------------------------------------------------------- +import numpy as np +import xml.etree.ElementTree as ET +import copy +import itertools +from collections import OrderedDict +#External Modules End----------------------------------------------------------- + +class FTStructure(object): + """ + This is the base class of the FT structure which actually handles FT structures which is used by the FTimporter and the FTmodel + """ + def __init__(self, inputs, topEventID): + """ + This method executes the postprocessor action. + @ In, inputs, list, list of file objects + @ Out, outcome, dict, dict containing the processed FT + """ + self.basicEvents = [] # List of Basic events of the FT + self.houseEvents = {} # List of House events of the FT + self.gateList = {} # Dict of Gates of the FT + self.gateID = [] # list of Gates name + self.topEventID = topEventID # ID of the FT output + + for fileID in inputs: + faultTree = ET.parse(fileID.getPath() + fileID.getFilename()) + faultTree = xmlU.findAllRecursive(faultTree,'opsa-mef') + + for gate in xmlU.findAllRecursive(faultTree[0], 'define-gate'): + ftGate = FTGate(gate) + self.gateList[gate.get('name')] = ftGate + self.gateID.append(gate.get('name')) + + for basicEvent in xmlU.findAllRecursive(faultTree[0], 'basic-event'): + self.basicEvents.append(basicEvent.get('name')) + + for houseEvent in xmlU.findAllRecursive(faultTree[0], 'define-house-event'): + value = houseEvent.find('constant').get('value') + if value in ['True','true']: + value = 1. + elif value in ['False','false']: + value = 0. + else: + raise IOError('FTImporterPostProcessor Post-Processor ' + self.name + ': house event ' + str(basicEvent.get('name')) + ' has a not boolean value (True or False)') + self.houseEvents[houseEvent.get('name')] = value + + if not self.topEventID in self.gateID: + raise IOError('FTImporterPostProcessor: specified top event ' + str(self.topEventID) + ' is not contained in the fault-tree; available gates are: ' + str(self.gateID)) + + def returnDict(self): + """ + This method calculates all possible input combinations and the corresponding output values + @ In, None + @ Out, outcome, dict, dictionary containing + """ + self.FTsolver() + outcome = self.constructData() + return outcome + + def FTsolver(self): + """ + This method determines the ordered sequence of gates to compute in order to solve the full FT. + The determined ordered sequence is stored in self.gateSequence. 
+ @ In, None + @ Out, None + """ + self.gateSequence = [] + availBasicEvents = copy.deepcopy(self.basicEvents) + availBasicEvents = availBasicEvents + self.houseEvents.keys() + counter = 0 + while True: + complete=False + for gate in self.gateList.keys(): + if set(self.gateList[gate].returnArguments()) <= set(availBasicEvents): + self.gateSequence.append(gate) + availBasicEvents.append(gate) + if set(availBasicEvents) == set(self.basicEvents+self.gateID+self.houseEvents.keys()): + complete=True + break + if counter > len(self.gateList.keys()): + raise IOError('FTImporterPostProcessor Post-Processor ' + self.name + ': the provided FT cannot be computed') + counter += 1 + if complete: + break + + def evaluateFT(self,combination): + """ + This method determines the outcome of the FT given a set of basic-event values + @ In, combination, dict, dictionary containing values for all basic-events + @ Out, values, dict, dictionary containing calculated values for all gates + """ + values = {} + for gate in self.gateSequence: + values[gate] = self.gateList[gate].evaluate(combination) + combination[gate] = values[gate] + return values + + def constructData(self): + """ + This method determines the outcome of the FT given a set of basic-event values + @ In, None + @ Out, outcome, dict, dictionary containing calculated values for all basic-events and the Top-event + """ + combinations = list(itertools.product([0,1],repeat=len(self.basicEvents))) + outcome={} + outcome={key:np.zeros(len(combinations)) for key in self.basicEvents} + outcome[self.topEventID] = np.zeros(len(combinations)) + for index,combination in enumerate(combinations): + combinationDict = {key: combination[index] for index,key in enumerate(self.basicEvents)} + for houseEvent in self.houseEvents.keys(): + combinationDict[houseEvent] = self.houseEvents[houseEvent] + out = self.evaluateFT(combinationDict) + for key in self.basicEvents: + outcome[key][index]=float(combinationDict[key]) + outcome[self.topEventID][index] = out[self.topEventID] + return outcome + + diff --git a/framework/PostProcessors/SampleSelector.py b/framework/PostProcessors/SampleSelector.py index 88045b359e..3d4d279348 100644 --- a/framework/PostProcessors/SampleSelector.py +++ b/framework/PostProcessors/SampleSelector.py @@ -137,4 +137,4 @@ def collectOutput(self, finishedJob, output): pick = evaluation[1] for key,value in pick.items(): pick[key] = np.atleast_1d(value) - output.addRealization(pick) \ No newline at end of file + output.addRealization(pick) diff --git a/framework/PostProcessors/__init__.py b/framework/PostProcessors/__init__.py index 33f00e779e..f97876669b 100644 --- a/framework/PostProcessors/__init__.py +++ b/framework/PostProcessors/__init__.py @@ -40,6 +40,12 @@ from .ExternalPostProcessor import ExternalPostProcessor from .InterfacedPostProcessor import InterfacedPostProcessor from .TopologicalDecomposition import TopologicalDecomposition +from .DataMining import DataMining +from .Metric import Metric +from .CrossValidation import CrossValidation +from .ETImporter import ETImporter +from .FTImporter import FTImporter +from .DataClassifier import DataClassifier from .ComparisonStatisticsModule import ComparisonStatistics # from .RavenOutput import RavenOutput # deprecated for now @@ -74,7 +80,8 @@ 'Metric', 'CrossValidation', 'ValueDuration', + 'FTImporter', + 'DataClassifier', 'SampleSelector', 'ETImporter'] + additionalModules - # 'RavenOutput', # deprecated for now diff --git a/framework/utils/InputData.py b/framework/utils/InputData.py index 
f6c1a6480f..b485a432f1 100644 --- a/framework/utils/InputData.py +++ b/framework/utils/InputData.py @@ -294,7 +294,7 @@ def convert(cls, value): @ In, value, string, the value to convert @ Out, convert, bool, the converted value """ - if value in utils.stringsThatMeanTrue(): + if value.lower() in utils.stringsThatMeanTrue(): return True else: return False diff --git a/framework/utils/xmlUtils.py b/framework/utils/xmlUtils.py index fadf82f7d9..86cb5b163e 100644 --- a/framework/utils/xmlUtils.py +++ b/framework/utils/xmlUtils.py @@ -348,6 +348,21 @@ def readExternalXML(extFile,extNode,cwd): raise IOError('XML UTILS ERROR: Node "{}" is not the root node of "{}"!'.format(extNode,extFile)) return root +def findAllRecursive(node, element): + """ + A function for recursively traversing a node in an elementTree to find + all instances of a tag. + Note that this method differs from findall() since it goes for all nodes, + subnodes, subsubnodes etc. recursively + @ In, node, ET.Element, the current node to search under + @ In, element, str, the string name of the tags to locate + @ Out, result, list, a list of the currently recovered results + """ + result=[] + for elem in node.iter(tag=element): + result.append(elem) + return result + def readVariableGroups(xmlNode,messageHandler,caller): """ Reads the XML for the variable groups and initializes them diff --git a/plugins/PRAplugin/doc/Introduction.tex b/plugins/PRAplugin/doc/Introduction.tex new file mode 100644 index 0000000000..10440cb6f5 --- /dev/null +++ b/plugins/PRAplugin/doc/Introduction.tex @@ -0,0 +1,14 @@ +\section{Introduction} +\label{sec:Introduction} + +This document provides a detailed description of the PRA plugin for the RAVEN~\cite{RAVEN,RAVENtheoryMan} code. +The features included in this plugin are: +\begin{itemize} + \item Event Tree (ET) Model (see Section~\ref{sec:ETModel}) + \item Fault Tree (FT) Model (see Section~\ref{sec:FTModel}) + \item Markov Model (see Section~\ref{sec:MarkovModel}) + \item Reliability Block Diagram (RBD) Model (see Section~\ref{sec:RBDmodel}) + \item Data Classifier (see Section~\ref{sec:dataClassifier}) + \item ET Data Importer (see Section~\ref{sec:ETdataImporter}) + \item FT Data Importer (see Section~\ref{sec:FTdataImporter}) +\end{itemize} diff --git a/plugins/PRAplugin/doc/Makefile b/plugins/PRAplugin/doc/Makefile new file mode 100644 index 0000000000..cfdce4e65a --- /dev/null +++ b/plugins/PRAplugin/doc/Makefile @@ -0,0 +1,21 @@ +SRCFILE = user_manual +MANUAL_FILES = user_manual.tex +LATEX_FLAGS=-interaction=nonstopmode + +all: example_plugin_documentation.pdf + +example_plugin_documentation.pdf: $(MANUAL_FILES) version.tex + pdflatex $(LATEX_FLAGS) $(SRCFILE).tex + bibtex $(SRCFILE) + pdflatex $(LATEX_FLAGS) $(SRCFILE).tex + pdflatex $(LATEX_FLAGS) $(SRCFILE).tex + +.PHONY: clean + +clean: + @rm -f *~ *.aux *.bbl *.blg *.log *.out *.toc *.lot *.lof $(SRCFILE).pdf + +#creating version file +version.tex : $(MANUAL_FILES) + git log -1 --format="%H %an %aD" > version.tex + diff --git a/plugins/PRAplugin/doc/figures/ET.pdf b/plugins/PRAplugin/doc/figures/ET.pdf new file mode 100644 index 0000000000000000000000000000000000000000..6c6f53b1a56a49f67d294345557ac1f8c7b19ce4 GIT binary patch literal 7587 zcma)B2{@GN`?i%#l1jFe$u8T>Ff&H>u}zX)b_QdNZ8U}vS+iu{vxOwtvPGrrOGF}D zl7=MNMYbaKpHZE2I-TqK{pZ3r@BOaN>v^B|zMoG}PfbGtEGZ2TY@hx#y->Jz^F@0H zKn4f`;_aLP^725b^B8w5(Fq8pKn6f54V)_xL!dle(L{_I#vbp00V*m2Jc$Gh+6~}C zTXsTwM8|GSHP2r~nem&_e?|`FcqJ6_mfj%;J+DF8uNN%uI+k^Ti!dVT!*V(- 
[remainder of GIT binary patch data for plugins/PRAplugin/doc/figures/ET.pdf (literal 7587) omitted]
diff --git a/plugins/PRAplugin/doc/figures/FT.pdf b/plugins/PRAplugin/doc/figures/FT.pdf
new file mode 100644
index 0000000000000000000000000000000000000000..1f549fb4e2e260a77fb008748f1b0e4625d0887d
[GIT binary patch data (literal 7801) omitted]
diff --git a/plugins/PRAplugin/doc/figures/RBD.pdf b/plugins/PRAplugin/doc/figures/RBD.pdf
new file mode 100644
index 0000000000000000000000000000000000000000..e179203d7e88324ad8f45eec7986ac580a6b5d49
[GIT binary patch data (literal 7798) omitted]
diff --git a/plugins/PRAplugin/doc/figures/markov.pdf b/plugins/PRAplugin/doc/figures/markov.pdf
new file mode 100644
index 0000000000000000000000000000000000000000..7a79be3c9412a095c6ffda011fc4c9879207c5f4
[GIT binary patch data (literal 12228) omitted]
diff --git a/plugins/PRAplugin/doc/include/DataClassifier.tex b/plugins/PRAplugin/doc/include/DataClassifier.tex
new file mode 100644
index 0000000000..f7fb553e8f
--- /dev/null
+++ b/plugins/PRAplugin/doc/include/DataClassifier.tex
@@ -0,0 +1,12 @@
+\section{Data Classifier}
+\label{sec:dataClassifier}
+
+The \textbf{DataClassifier}
post-processor is specifically used to classify the data stored in the DataObjects. +The details about this post-processors can be found in raven user manual subsection \textbf{PostProcessor} +of section \textbf{Models}. + +\subsection{Data Classifier reference tests} +\begin{itemize} + \item test\_dataClassifier\_postprocessor.xml + \item test\_dataClassifier\_postprocessor\_HS.xml +\end{itemize} diff --git a/plugins/PRAplugin/doc/include/ETdataImporter.tex b/plugins/PRAplugin/doc/include/ETdataImporter.tex new file mode 100644 index 0000000000..baedafad77 --- /dev/null +++ b/plugins/PRAplugin/doc/include/ETdataImporter.tex @@ -0,0 +1,19 @@ +\section{ET Data Importer} +\label{sec:ETdataImporter} + +This Post-Processor is designed to import an ET as a PointSet in RAVEN. +The ET must be specified in a specific format: the OpenPSA format (\href{}{https://github.com/open-psa}). +The details about this post-processors can be found in raven user manual subsection \textbf{PostProcessor} +of section \textbf{Models}. + +\subsection{ET Importer reference tests} +\begin{itemize} + \item test\_ETimporter.xml + \item test\_ETimporterMultipleET.xml + \item test\_ETimporterSymbolic.xml + \item test\_ETimporter\_expand.xml + \item test\_ETimporter\_DefineBranch.xml + \item test\_ETimporter\_3branches.xml + \item test\_ETimporter\_3branches\_NewNumbering.xml + \item test\_ETimporter\_3branches\_NewNumbering\_expanded.xml +\end{itemize} diff --git a/plugins/PRAplugin/doc/include/ETmodel.tex b/plugins/PRAplugin/doc/include/ETmodel.tex new file mode 100644 index 0000000000..d36bf03f9a --- /dev/null +++ b/plugins/PRAplugin/doc/include/ETmodel.tex @@ -0,0 +1,95 @@ +\section{ET Model} +\label{sec:ETModel} + +This model is designed to read from file the structure of the ET and to import such Boolean logic structure as a RAVEN model. +The ET must be specified in a specific format: the OpenPSA format (\href{}{https://github.com/open-psa}). +As an example, the ET of Fig.~\ref{fig:ET} is translated in the OpenPSA format as shown below: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=ET of Fig.~\ref{fig:ET} in OpenPSA format., label=lst:ETModel] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +\end{lstlisting} + +\begin{figure} + \centering + \centerline{\includegraphics[scale=0.5]{ET.pdf}} + \caption{Example of ET.} + \label{fig:ET} +\end{figure} + +The ET of Fig.~\ref{fig:ET} and defined in Listing~\ref{lst:ETModel} can be defined in the RAVEN input file as follows: +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=ET model input example., label=lst:ET_InputExample] + + ... + + + statusACC,statusLPI,statusLPR,sequence + + ACC + LPI + LPR + sequence + + ... + +\end{lstlisting} + +All the specifications of the ET model are given in the +\xmlNode{ExternalModel} block. 
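Internally, the plugin collects these specifications in the \texttt{\_readMoreXML} method of \texttt{ETModel.py} (see the plugin sources). The stand-alone sketch below is a simplified, illustrative version of that parsing step, not the plugin source itself; the individual XML nodes are described next.

\begin{lstlisting}[language=Python,caption=Illustrative sketch of how the ET model specification is parsed., label=lst:ETModelParseSketch]
import xml.etree.ElementTree as ET

def readETModelSpec(xmlText):
  """
    Simplified sketch of ETModel._readMoreXML: collect the variable list,
    the sequence alias, and the alias -> ET-branching-variable mapping.
  """
  spec = {'variables': [], 'sequenceID': None, 'mapping': {}}
  for child in ET.fromstring(xmlText):
    if child.tag == 'variables':
      spec['variables'] = [var.strip() for var in child.text.split(',')]
    elif child.tag == 'sequenceID':
      spec['sequenceID'] = child.text.strip()
    elif child.tag == 'map':
      # the alias variable (attribute 'var') maps to the ET branching variable (node text)
      spec['mapping'][child.get('var')] = child.text.strip()
  return spec
\end{lstlisting}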
+Inside the \xmlNode{ExternalModel} block, the XML
+nodes that belong to this model are:
+\begin{itemize}
+  \item \xmlNode{variables}, \xmlDesc{string, required parameter}, a list containing the names of both the input and output variables of the model
+  \item \xmlNode{sequenceID}, \xmlDesc{string, required parameter}, the name of the alias variable that indicates the branch ID
+  \item \xmlNode{map}, \xmlDesc{string, required parameter}, the name ID of the ET branching variable
+  \begin{itemize}
+    \item \xmlAttr{var}, \xmlDesc{required string attribute}, the ALIAS name ID of the ET branching variable
+  \end{itemize}
+\end{itemize}
+
+Provided this definition and the ET model of Fig.~\ref{fig:ET} described in Listing~\ref{lst:ETModel},
+the resulting model in RAVEN is characterized by these variables:
+\begin{itemize}
+  \item Input variables: statusACC, statusLPI, statusLPR
+  \item Output variable: sequence
+\end{itemize}
+
+\subsection{ET model reference tests}
+\begin{itemize}
+  \item test\_ETmodel.xml
+  \item test\_ETmodel\_TD.xml
+\end{itemize}
diff --git a/plugins/PRAplugin/doc/include/FTdataImporter.tex b/plugins/PRAplugin/doc/include/FTdataImporter.tex
new file mode 100644
index 0000000000..1b7985a1ba
--- /dev/null
+++ b/plugins/PRAplugin/doc/include/FTdataImporter.tex
@@ -0,0 +1,28 @@
+\section{FT Data Importer}
+\label{sec:FTdataImporter}
+
+This Post-Processor is designed to import an FT as a PointSet in RAVEN.
+The FT must be specified in a specific format: the OpenPSA format (\href{https://github.com/open-psa}{https://github.com/open-psa}).
+The details about this post-processor can be found in the RAVEN user manual, subsection \textbf{PostProcessor}
+of section \textbf{Models}.
+
+\subsection{FT Importer reference tests}
+\begin{itemize}
+  \item test\_FTimporter\_and\_withNOT\_embedded.xml
+  \item test\_FTimporter\_and\_withNOT\_withNOT\_embedded.xml
+  \item test\_FTimporter\_and\_withNOT.xml
+  \item test\_FTimporter\_and.xml
+  \item test\_FTimporter\_atleast.xml
+  \item test\_FTimporter\_cardinality.xml
+  \item test\_FTimporter\_component.xml
+  \item test\_FTimporter\_doubleNot.xml
+  \item test\_FTimporter\_iff.xml
+  \item test\_FTimporter\_imply.xml
+  \item test\_FTimporter\_multipleFTs.xml
+  \item test\_FTimporter\_nand.xml
+  \item test\_FTimporter\_nor.xml
+  \item test\_FTimporter\_not.xml
+  \item test\_FTimporter\_or\_houseEvent.xml
+  \item test\_FTimporter\_or.xml
+  \item test\_FTimporter\_xor.xml
+\end{itemize}
diff --git a/plugins/PRAplugin/doc/include/FTmodel.tex b/plugins/PRAplugin/doc/include/FTmodel.tex
new file mode 100644
index 0000000000..61b5abe422
--- /dev/null
+++ b/plugins/PRAplugin/doc/include/FTmodel.tex
@@ -0,0 +1,108 @@
+\section{FT Model}
+\label{sec:FTModel}
+
+This model is designed to read the structure of the FT from file and to import such a Boolean logic structure as a RAVEN model.
+The FT must be specified in a specific format: the OpenPSA format (\href{https://github.com/open-psa}{https://github.com/open-psa}).
+As an example, the FT of Fig.~\ref{fig:FT} is translated in the OpenPSA as shown below: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=FT in OpenPSA format., label=lst:FTModel] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +\end{lstlisting} + +\begin{figure} + \centering + \centerline{\includegraphics[scale=0.5]{FT.pdf}} + \caption{Example of FT.} + \label{fig:FT} +\end{figure} + +The FT of Fig.~\ref{fig:FT} and defined in Listing~\ref{lst:FTModel} can be defined in the RAVEN input file as follows: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=FT model input example., label=lst:FT_InputExample] + + ... + + + statusBE1,statusBE2,statusBE3,statusBE4,TOP + + TOP + BE1 + BE2 + BE3 + BE4 + + ... + +\end{lstlisting} + +All the specifications of the FT model are given in the +\xmlNode{ExternalModel} block. +Inside the \xmlNode{ExternalModel} block, the XML +nodes that belong to this models are: +\begin{itemize} + \item \xmlNode{variables}, \xmlDesc{string, required parameter}, a list containing the names of both the input and output variables of the model + \item \xmlNode{topEvents},\xmlDesc{string, required parameter}, the name of the alias Top Event + \item \xmlNode{map},\xmlDesc{string, required parameter}, the name ID of the FT basic events + \begin{itemize} + \item \xmlAttr{var}, \xmlDesc{required string attribute}, the ALIAS name ID of the FT basic events + \end{itemize} +\end{itemize} + +Provided this definition, the FT model of Fig.~\ref{fig:ET} and described in Listing~\ref{lst:ETmodel}, +the resulting model in RAVEN is characterized by these variables: +\begin{itemize} + \item Input variables: statusBE1, statusBE2, statusBE3, statusBE4 + \item Output variable: TOP +\end{itemize} + +\subsection{FT model reference tests} +\begin{itemize} + \item test\_FTmodel.xml + \item test\_FTmodel\_TD.xml +\end{itemize} diff --git a/plugins/PRAplugin/doc/include/MarkovModel.tex b/plugins/PRAplugin/doc/include/MarkovModel.tex new file mode 100644 index 0000000000..396b6aef6e --- /dev/null +++ b/plugins/PRAplugin/doc/include/MarkovModel.tex @@ -0,0 +1,68 @@ +\section{Markov Model} +\label{sec:MarkovModel} + +This model is designed to import a generic Markov chain as a RAVEN model. +As an example, the Markov chain of Fig.~\ref{fig:markov} is translated in the OpenPSA as shown below: + +\begin{figure} + \centering + \centerline{\includegraphics[scale=0.5]{markov.pdf}} + \caption{Example of continuous time Markov chain (source Wikipedia: https://en.wikipedia.org/wiki/Markov\_chain).} + \label{fig:markov} +\end{figure} + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=Markov model input example., label=lst:Markov_InputExample] + + + initialState,finalState + initialState + finalState + 1000 + + 2 + 3 + + + 1 + 3 + + + 1 + 2 + + + +\end{lstlisting} + +All the specifications of the Markov model are given in the \xmlNode{ExternalModel} block. 
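Before the individual nodes are described, the sketch below shows how the four transition types documented later in this section (\texttt{lambda}, \texttt{tau}, \texttt{instant}, and \texttt{unif}) translate into sampled transition times. It is a stand-alone illustration using \texttt{numpy}, not the plugin source; exponential sampling is assumed for the rate-based types, which is the standard continuous-time Markov chain interpretation.

\begin{lstlisting}[language=Python,caption=Illustrative sampling of a single Markov transition time., label=lst:MarkovTransitionSketch]
import numpy as np

rng = np.random.default_rng(250678)  # the plugin seeds its generator with 250678 by default

def sampleTransitionTime(transType, value):
  """
    Illustrative sampling of the time to one transition; transType and value
    mirror the 'type' and 'value' attributes of a transition node.
  """
  if transType == 'lambda':    # transition rate; mean time to transition = 1/lambda
    return rng.exponential(1.0 / value)
  elif transType == 'tau':     # mean time to transition, tau = 1/lambda
    return rng.exponential(value)
  elif transType == 'instant': # deterministic transition time given in the input
    return value
  elif transType == 'unif':    # value = (low, high) bounds of a uniform distribution
    low, high = value
    return rng.uniform(low, high)
  raise ValueError("unknown transition type: " + str(transType))
\end{lstlisting}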
+Inside the \xmlNode{ExternalModel} block, the XML nodes that belong to this model are: +\begin{itemize} + \item \xmlNode{variables}, \xmlDesc{string, required parameter}, a list containing the names of both the input and output variables of the model + \item \xmlNode{initState}, \xmlDesc{string, required parameter}, variable ID corresponding to initial state + \item \xmlNode{finState}, \xmlDesc{string, required parameter}, variable ID corresponding to final state + \item \xmlNode{endTime}, \xmlDesc{float, required parameter}, time horizon to evaluate Markov chain transition history + \item \xmlNode{state}, specifies a single node; inside a \xmlNode{state} all possible transitions OUT of this state must be specified + in the \xmlNode{transition} xml sub-nodes: + \begin{itemize} + \item \xmlAttr{transition}, \xmlDesc{required string attribute}, arrival state + \item \xmlAttr{type}, \xmlDesc{required string attribute}, type of transition. Allowed transition types are: The ET of Fig.~\ref{fig:ET} and defined in Listing~\ref{lst:ETmodel} can be defined in the RAVEN input file as follows: lambda, tau, instant and unif (see below) + \item \xmlAttr{value}, \xmlDesc{required string attribute}, value associated to the particular transition + \end{itemize} +\end{itemize} + +The following transition types are available: +\begin{itemize} + \item lambda: classical continuous time Markov chain transition rate in $\lambda$ form + \item tau: classical continuous time Markov chain transition rate in the $\tau = \frac{1}{\lambda}$ form + \item instant: deterministic transition out of particular state; the exact transition time is provided in input + \item unif: transition time is uniformly sampled between the two provided values in the \xmlAttr{value} node +\end{itemize} + +\subsection{Markov model reference tests} +\begin{itemize} + \item test\_markovModel\_2states\_tau.xml + \item test\_markovModel\_2states.xml + \item test\_markovModel\_3states\_complexTrans.xml + \item test\_markovModel\_3states\_instantTrans.xml + \item test\_markovModel\_3states.xml +\end{itemize} diff --git a/plugins/PRAplugin/doc/include/RBDmodel.tex b/plugins/PRAplugin/doc/include/RBDmodel.tex new file mode 100644 index 0000000000..b8ae59463f --- /dev/null +++ b/plugins/PRAplugin/doc/include/RBDmodel.tex @@ -0,0 +1,103 @@ +\section{RBD Model} +\label{sec:RBDmodel} + +This model is designed to read from file the structure of the RBD and import such Boolean logic structure as a RAVEN model. +The RBD must be specified in a specific format; as an example, the RBD of Fig.~\ref{fig:RBD} is translated in the RAVEN format as shown below: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=RBD input file., label=lst:RBDmodel] + + + 1 + + + 2,3,4 + + + 5 + + + 5 + + + 5 + + + 6,7,8 + + + SG1 + + + SG2 + + + SG3 + + + + + + + + +\end{lstlisting} + +\begin{figure} + \centering + \centerline{\includegraphics[scale=0.5]{RBD.pdf}} + \caption{Example of RBD.} + \label{fig:RBD} +\end{figure} + +The FT of Fig.~\ref{fig:RBD} and defined in Listing~\ref{lst:RBDmodel} can be defined in the RAVEN input file as follows: + +\begin{lstlisting}[style=XML,morekeywords={anAttribute},caption=RBD model input example., label=lst:RBD_InputExample] + + ... + + + status2,status3,status4,status5, + statusSG1,statusSG2,statusSG3 + + graphTest + CST + SG1,SG2,SG3 + 2 + 3 + 4 + 5 + SG1 + SG2 + SG3 + + ... + +\end{lstlisting} + +All the specifications of the RBD model are given in the +\xmlNode{ExternalModel} block. 
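At run time, every component whose status variable equals 1 (failed) is removed from the graph, and each output node is then checked for reachability from the input nodes: the output status is 0 if at least one path survives and 1 otherwise (this mirrors the evaluation performed in \texttt{GraphModel.py} in the plugin sources). The following stand-alone sketch, built on a small hypothetical two-train layout rather than the RBD of Fig.~\ref{fig:RBD}, illustrates that evaluation.

\begin{lstlisting}[language=Python,caption=Illustrative reachability evaluation of an RBD., label=lst:RBDEvalSketch]
def findPath(graph, start, goal, visited=None):
  """ Depth-first search for any surviving path start -> goal in a {node: [children]} dict """
  visited = set() if visited is None else visited
  if start == goal:
    return True
  visited.add(start)
  return any(findPath(graph, child, goal, visited)
             for child in graph.get(start, []) if child not in visited)

def evaluateRBD(graph, nodesIN, nodesOUT, failed):
  """ Output status: 0. if reachable from at least one input node, 1. otherwise """
  reduced = {n: [c for c in children if c not in failed]
             for n, children in graph.items() if n not in failed}
  return {out: 0. if any(findPath(reduced, inp, out) for inp in nodesIN) else 1.
          for out in nodesOUT}

# hypothetical layout: a tank (CST) feeds two redundant pumps, either of which reaches SG1
layout = {'CST': ['pumpA', 'pumpB'], 'pumpA': ['SG1'], 'pumpB': ['SG1'], 'SG1': []}
print(evaluateRBD(layout, ['CST'], ['SG1'], failed={'pumpA'}))           # {'SG1': 0.0}
print(evaluateRBD(layout, ['CST'], ['SG1'], failed={'pumpA', 'pumpB'}))  # {'SG1': 1.0}
\end{lstlisting}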
+Inside the \xmlNode{ExternalModel} block, the XML +nodes that belong to this models are: +\begin{itemize} + \item \xmlNode{variables}, \xmlDesc{string, required parameter}, a list containing the names of both the input and output variables of the model + \item \xmlNode{modelFile},\xmlDesc{string, required parameter}, the name of the file that provide the RBD structure + \item \xmlNode{nodesIN},\xmlDesc{string, required parameter}, the name of the input nodes + \item \xmlNode{nodesOUT},\xmlDesc{string, required parameter}, the name of the output nodes + \item \xmlNode{map},\xmlDesc{string, required parameter}, the name ID of the RBD node + \begin{itemize} + \item \xmlAttr{var}, \xmlDesc{required string attribute}, the ALIAS name ID of the RBD node + \end{itemize} +\end{itemize} + +Provided this definition, the RBD model of Fig.~\ref{fig:RBD} and described in Listing~\ref{lst:RBDmodel}, +the resulting model in RAVEN is characterized by these variables: +\begin{itemize} + \item Input variables: status2, status3, status4, status5 + \item Output variable: statusSG1, statusSG2, statusSG3 +\end{itemize} + +\subsection{RBD model reference tests} +\begin{itemize} + \item test\_graphModel.xml + \item test\_graphModel\_TD.xml +\end{itemize} diff --git a/plugins/PRAplugin/doc/make_pra_plugin_docs.sh b/plugins/PRAplugin/doc/make_pra_plugin_docs.sh new file mode 100755 index 0000000000..0a5f5b9f4a --- /dev/null +++ b/plugins/PRAplugin/doc/make_pra_plugin_docs.sh @@ -0,0 +1,39 @@ + +VERB=0 +for i in "$@" +do + if [[ $i == "--verbose" ]] + then + VERB=1 + echo Entering verbose mode... + fi +done + +if git describe +then + git describe | sed 's/_/\\_/g' > new_version.tex + echo "\\\\" >> new_version.tex + git log -1 --format="%H %an\\\\%aD" . >> new_version.tex + if diff new_version.tex version.tex + then + echo No change in version.tex + else + mv new_version.tex version.tex + fi +fi + +echo Building +if [[ 1 -eq $VERB ]] +then + make; MADE=$? +else + make > /dev/null; MADE=$? +fi + +if [[ 0 -eq $MADE ]] +then + echo ...Successfully made docs +else + echo ...Failed to make docs + exit -1 +fi diff --git a/plugins/PRAplugin/doc/user_manual.bib b/plugins/PRAplugin/doc/user_manual.bib new file mode 100644 index 0000000000..37fd345b2b --- /dev/null +++ b/plugins/PRAplugin/doc/user_manual.bib @@ -0,0 +1,14 @@ + +@techreport{RAVEN, + Author = {C. Rabiti and A. Alfonsi and J. Cogliati and D. Mandelli and R. Kinoshita and S. Sen and C. Wang and J. Chen}, + Publisher = {Idaho National laboratory (INL)}, + Number = {INL/EXT-15-34123}, + Source = {https://raven.inl.gov}, + Title = {RAVEN user manual}, + Year = {March 2017}} +@techreport{RAVENtheoryMan, + Author = {A. Alfonsi and C. Rabiti and D. Mandelli and J. Cogliati and C. Wang and P. W. Talbot and D. P. Maljovec and C. 
Smith}, + title = {RAVEN Theory Manual and User Guide}, + institution = {Idaho National Laboratory (INL)}, + booktitle = {INL/EXT-16-38178}, + year = {2017}} diff --git a/plugins/PRAplugin/doc/user_manual.tex b/plugins/PRAplugin/doc/user_manual.tex new file mode 100644 index 0000000000..089d168f1b --- /dev/null +++ b/plugins/PRAplugin/doc/user_manual.tex @@ -0,0 +1,263 @@ +% +\documentclass[pdf,12pt]{article} + +%\usepackage{times} +%\usepackage[FIGBOTCAP,normal,bf,tight]{subfigure} +\usepackage{amsmath} +\usepackage{amssymb} +%\usepackage{pifont} +\usepackage{enumerate} +\usepackage{listings} +\usepackage{fullpage} +\usepackage{xcolor} % Using xcolor for more robust color specification +%\usepackage{ifthen} % For simple checking in newcommand blocks +%\usepackage{textcomp} +%\usepackage{authblk} % For making the author list look prettier +%\renewcommand\Authsep{,~\,} + +% Custom colors +\definecolor{deepblue}{rgb}{0,0,0.5} +\definecolor{deepred}{rgb}{0.6,0,0} +\definecolor{deepgreen}{rgb}{0,0.5,0} +\definecolor{forestgreen}{RGB}{34,139,34} +\definecolor{orangered}{RGB}{239,134,64} +\definecolor{darkblue}{rgb}{0.0,0.0,0.6} +\definecolor{gray}{rgb}{0.4,0.4,0.4} + +\lstset { + basicstyle=\ttfamily, + frame=single +} + +\lstdefinestyle{XML} { + language=XML, + extendedchars=true, + breaklines=true, + breakatwhitespace=true, +% emph={name,dim,interactive,overwrite}, + emphstyle=\color{red}, + basicstyle=\ttfamily, +% columns=fullflexible, + commentstyle=\color{gray}\upshape, + morestring=[b]", + morecomment=[s]{}, + morecomment=[s][\color{forestgreen}]{}, + keywordstyle=\color{cyan}, + stringstyle=\ttfamily\color{black}, + tagstyle=\color{darkblue}\bf\ttfamily, + morekeywords={name,type}, +% morekeywords={name,attribute,source,variables,version,type,release,x,z,y,xlabel,ylabel,how,text,param1,param2,color,label}, +} +\lstset{language=xml} + +\usepackage{titlesec} +\newcommand{\sectionbreak}{\clearpage} +\setcounter{secnumdepth}{4} + + +%%%%%%%% Begin comands definition to input python code into document +\usepackage[utf8]{inputenc} + +% Default fixed font does not support bold face +\DeclareFixedFont{\ttb}{T1}{txtt}{bx}{n}{9} % for bold +\DeclareFixedFont{\ttm}{T1}{txtt}{m}{n}{9} % for normal + +\usepackage{listings} + +% Python style for highlighting +%\newcommand\pythonstyle{\lstset{ +%language=Python, +%basicstyle=\ttm, +%otherkeywords={self, none, return}, % Add keywords here +%keywordstyle=\ttb\color{deepblue}, +%emph={MyClass,__init__}, % Custom highlighting +%emphstyle=\ttb\color{deepred}, % Custom highlighting style +%stringstyle=\color{deepgreen}, +%frame=tb, % Any extra options here +%showstringspaces=false % +%}} + + +% Python environment +%\lstnewenvironment{python}[1][] +%{ +%$\pythonstyle +%\lstset{#1} +%} +%{} + +% Python for external files +%\newcommand\pythonexternal[2][]{{ +%\pythonstyle +%\lstinputlisting[#1]{#2}}} +% +%\lstnewenvironment{xml} +%{} +%{} + +% Python for inline +%\newcommand\pythoninline[1]{{\pythonstyle\lstinline!#1!}} + +%\def\DRAFT{} % Uncomment this if you want to see the notes people have been adding +% Comment command for developers (Should only be used under active development) +%\ifdefined\DRAFT +% \newcommand{\nameLabeler}[3]{\textcolor{#2}{[[#1: #3]]}} +%\else +% \newcommand{\nameLabeler}[3]{} +%\fi +% Commands for making the LaTeX a bit more uniform and cleaner +%\newcommand{\TODO}[1] {\textcolor{red}{\textit{(#1)}}} +\newcommand{\xmlAttrRequired}[1] {\textcolor{red}{\textbf{\texttt{#1}}}} +\newcommand{\xmlAttr}[1] 
{\textcolor{cyan}{\textbf{\texttt{#1}}}} +\newcommand{\xmlNodeRequired}[1] {\textcolor{deepblue}{\textbf{\texttt{<#1>}}}} +\newcommand{\xmlNode}[1] {\textcolor{darkblue}{\textbf{\texttt{<#1>}}}} +\newcommand{\xmlString}[1] {\textcolor{black}{\textbf{\texttt{'#1'}}}} +\newcommand{\xmlDesc}[1] {\textbf{\textit{#1}}} % Maybe a misnomer, but I am + % using this to detail the data + % type and necessity of an XML + % node or attribute, + % xmlDesc = XML description +\newcommand{\default}[1]{~\\*\textit{Default: #1}} +\newcommand{\nb} {\textcolor{deepgreen}{\textbf{~Note:}}~} + +% The bm package provides \bm for bold math fonts. Apparently +% \boldsymbol, which I used to always use, is now considered +% obsolete. Also, \boldsymbol doesn't even seem to work with +% the fonts used in this particular document... +\usepackage{bm} + +% Define tensors to be in bold math font. +\newcommand{\tensor}[1]{{\bm{#1}}} + +% Override the formatting used by \vec. Instead of a little arrow +% over the letter, this creates a bold character. +\renewcommand{\vec}{\bm} + +% Define unit vector notation. If you don't override the +% behavior of \vec, you probably want to use the second one. +\newcommand{\unit}[1]{\hat{\bm{#1}}} + +% Use this to refer to a single component of a unit vector. +\newcommand{\scalarunit}[1]{\hat{#1}} + +% \toprule, \midrule, \bottomrule for tables +\usepackage{booktabs} + +% \llbracket, \rrbracket +\usepackage{stmaryrd} + +\usepackage{hyperref} + +\usepackage{graphicx} +\graphicspath{{./figures/}} + +% Compress lists of citations like [33,34,35,36,37] to [33-37] +\usepackage{cite} + +% If you want to relax some of the SAND98-0730 requirements, use the "relax" +% option. It adds spaces and boldface in the table of contents, and does not +% force the page layout sizes. +% e.g. \documentclass[relax,12pt]{SANDreport} +% +% You can also use the "strict" option, which applies even more of the +% SAND98-0730 guidelines. It gets rid of section numbers which are often +% useful; e.g. \documentclass[strict]{SANDreport} + +% The INLreport class uses \flushbottom formatting by default (since +% it's intended to be two-sided document). \flushbottom causes +% additional space to be inserted both before and after paragraphs so +% that no matter how much text is actually available, it fills up the +% page from top to bottom. My feeling is that \raggedbottom looks much +% better, primarily because most people will view the report +% electronically and not in a two-sided printed format where some argue +% \raggedbottom looks worse. If we really want to have the original +% behavior, we can comment out this line... +\raggedbottom +\setcounter{secnumdepth}{5} % show 5 levels of subsection +\setcounter{tocdepth}{5} % include 5 levels of subsection in table of contents + +% ---------------------------------------------------------------------------- % +% +% Set the title, author, and date +% +\title{The RAVEN PRA Plugin \\ - User Manual -} +%\author{% +%\begin{tabular}{c} Author 1 \\ University1 \\ Mail1 \\ \\ +%Author 3 \\ University3 \\ Mail3 \end{tabular} \and +%\begin{tabular}{c} Author 2 \\ University2 \\ Mail2 \\ \\ +%Author 4 \\ University4 \\ Mail4\\ +%\end{tabular} } + + +\author{D. Mandelli, C. Wang, A. Alfonsi} + +% There is a "Printed" date on the title page of a SAND report, so +% the generic \date should [WorkingDir:]generally be empty. 
+\date{\today} + +%\def\component#1{\texttt{#1}} + +% ---------------------------------------------------------------------------- % +%\newcommand{\systemtau}{\tensor{\tau}_{\!\text{SUPG}}} + +% Added by Sonat +%\usepackage{placeins} +%\usepackage{array} + +%\newcolumntype{L}[1]{>{\raggedright\let\newline\\\arraybackslash\hspace{0pt}}m{#1}} +%\newcolumntype{C}[1]{>{\centering\let\newline\\\arraybackslash\hspace{0pt}}m{#1}} +%\newcolumntype{R}[1]{>{\raggedleft\let\newline\\\arraybackslash\hspace{0pt}}m{#1}} + +% end added by Sonat +% ---------------------------------------------------------------------------- % +% +% Start the document +% + +\begin{document} + \maketitle + + % ------------------------------------------------------------------------ % + % The table of contents and list of figures and tables + % Comment out \listoffigures and \listoftables if there are no + % figures or tables. Make sure this starts on an odd numbered page + % + \cleardoublepage % TOC needs to start on an odd page + \tableofcontents + %\listoffigures + %\listoftables + % ---------------------------------------------------------------------- % + + % ---------------------------------------------------------------------- % + % This is where the body of the report begins; usually with an Introduction + % + \input{Introduction.tex} + \input{include/ETmodel.tex} + \input{include/FTmodel.tex} + \input{include/MarkovModel.tex} + \input{include/RBDmodel.tex} + \input{include/DataClassifier.tex} + \input{include/ETdataImporter.tex} + \input{include/FTdataImporter.tex} + + \section*{Document Version Information} + This document has been compiled using the following version of the plug-in git repository: + \newline + \input{version.tex} + + % ---------------------------------------------------------------------- % + % References + % + \clearpage + % If hyperref is included, then \phantomsection is already defined. + % If not, we need to define it. + \providecommand*{\phantomsection}{} + \phantomsection + \addcontentsline{toc}{section}{References} + \bibliographystyle{ieeetr} + \bibliography{user_manual} + + + % ---------------------------------------------------------------------- % + +\end{document} diff --git a/plugins/PRAplugin/src/ETModel.py b/plugins/PRAplugin/src/ETModel.py new file mode 100644 index 0000000000..771c6f27ed --- /dev/null +++ b/plugins/PRAplugin/src/ETModel.py @@ -0,0 +1,105 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from PluginsBaseClasses.ExternalModelPluginBase import ExternalModelPluginBase +from PostProcessors.ETStructure import ETStructure +#Internal Modules End----------------------------------------------------------- + + +class ETModel(ExternalModelPluginBase): + """ + This class is designed to create an Event-Tree model + """ + def __init__(self): + """ + Constructor + @ In, None + @ Out, None + """ + ExternalModelPluginBase.__init__(self) + + def _readMoreXML(self, container, xmlNode): + """ + Method to read the portion of the XML that belongs to the Event-Tree model + @ In, container, object, self-like object where all the variables can be stored + @ In, xmlNode, xml.etree.ElementTree.Element, XML node that needs to be read + @ Out, None + """ + container.mapping = {} + container.InvMapping = {} + + for child in xmlNode: + if child.tag == 'topEvents': + container.topEventID = child.text.strip() + elif child.tag == 'map': + container.mapping[child.get('var')] = child.text.strip() + container.InvMapping[child.text.strip()] = child.get('var') + elif child.tag == 'variables': + variables = [str(var.strip()) for var in child.text.split(",")] + elif child.tag == 'sequenceID': + container.sequenceID = child.text.strip() + else: + raise IOError("ETModel: xml node " + str (child.tag) + " is not allowed") + + def initialize(self, container, runInfoDict, inputFiles): + """ + Method to initialize the Event-Tree model + @ In, container, object, self-like object where all the variables can be stored + @ In, runInfoDict, dict, dictionary containing all the RunInfo parameters (XML node ) + @ In, inputFiles, list, list of input files (if any) + @ Out, None + """ + pass + + def createNewInput(self, container, inputs, samplerType, **Kwargs): + """ + This function has been added for this model in order to be able to create an ETStructure from multiple files + @ In, container, object, self-like object where all the variables can be stored + @ In, myInput, list, the inputs (list) to start from to generate the new one + @ In, samplerType, string, is the type of sampler that is calling to generate a new input + @ In, **kwargs, dict, is a dictionary that contains the information coming from the sampler, + a mandatory key is the sampledVars'that contains a dictionary {'name variable':value} + @ Out, ([(inputDict)],copy.deepcopy(kwargs)), tuple, return the new input in a tuple form + """ + container.eventTreeModel = ETStructure(inputs=inputs, expand=True) + return Kwargs + + def run(self, container, Inputs): + """ + This method provides the sequence of the ET given the status of its branching conditions + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + """ + inputForET = {} + for key in container.InvMapping.keys(): + if Inputs[container.InvMapping[key]] > 0: + inputForET[key] = 1.0 + else: + inputForET[key] = 0.0 + + value = container.eventTreeModel.solve(inputForET) + container.__dict__[container.sequenceID]= value diff --git a/plugins/PRAplugin/src/FTModel.py 
b/plugins/PRAplugin/src/FTModel.py new file mode 100644 index 0000000000..47430b95a6 --- /dev/null +++ b/plugins/PRAplugin/src/FTModel.py @@ -0,0 +1,188 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +import numpy as np +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from PluginsBaseClasses.ExternalModelPluginBase import ExternalModelPluginBase +from PostProcessors.FTStructure import FTStructure +#Internal Modules End----------------------------------------------------------- + + +class FTModel(ExternalModelPluginBase): + """ + This class is designed to create a Fault-Tree model + """ + def __init__(self): + """ + Constructor + @ In, None + @ Out, None + """ + ExternalModelPluginBase.__init__(self) + + def _readMoreXML(self, container, xmlNode): + """ + Method to read the portion of the XML that belongs to the Fault-Tree model + @ In, container, object, self-like object where all the variables can be stored + @ In, xmlNode, xml.etree.ElementTree.Element, XML node that needs to be read + @ Out, None + """ + container.mapping = {} + container.InvMapping = {} + + for child in xmlNode: + if child.tag == 'topEvents': + container.topEventID = child.text.strip() + elif child.tag == 'map': + container.mapping[child.get('var')] = child.text.strip() + container.InvMapping[child.text.strip()] = child.get('var') + elif child.tag == 'variables': + variables = [str(var.strip()) for var in child.text.split(",")] + else: + raise IOError("FTModel: xml node " + str (child.tag) + " is not allowed") + + def initialize(self, container, runInfoDict, inputFiles): + """ + Method to initialize this plugin + @ In, container, object, self-like object where all the variables can be stored + @ In, runInfoDict, dict, dictionary containing all the RunInfo parameters (XML node ) + @ In, inputFiles, list, list of input files (if any) + @ Out, None + """ + pass + + def createNewInput(self, container, inputs, samplerType, **Kwargs): + """ + This function has been added for this model in order to be able to create a FTstructure from multiple files + @ In, myInput, list, the inputs (list) to start from to generate the new one + @ In, samplerType, string, is the type of sampler that is calling to generate a new input + @ In, **kwargs, dict, is a dictionary that contains the information coming from the sampler, + a mandatory key is the sampledVars'that contains a dictionary {'name variable':value} + @ Out, ([(inputDict)],copy.deepcopy(kwargs)), tuple, return the new input in a tuple form + """ + container.faultTreeModel = FTStructure(inputs, container.topEventID) + 
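    # The FT Boolean structure is built above from the input file(s) and the Top Event ID;
    # FTsolver() (from PostProcessors.FTStructure) then resolves the tree logic so that
    # run() only needs to evaluate it, via evaluateFT(), for each sampled realization.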
container.faultTreeModel.FTsolver() + return Kwargs + + def run(self, container, Inputs): + """ + This method determines the status of the TopEvent of the FT provided the status of its Basic Events + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + """ + if self.checkTypeOfAnalysis(container,Inputs): + value = self.runTimeDep(container, Inputs) + else: + value = self.runStatic(container, Inputs) + + container.__dict__[container.topEventID]= value[container.topEventID] + + def checkTypeOfAnalysis(self,container,Inputs): + """ + This method checks which type of analysis to be performed: + - True: dynamic (time dependent) + - False: static + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, analysisType, bool, type of analysis to be performed + + """ + arrayValues=set() + for key in Inputs.keys(): + if key in container.mapping.keys(): + arrayValues.add(Inputs[key]) + analysisType = None + if arrayValues.difference({0.,1.}): + analysisType = True + else: + analysisType = False + return analysisType + + def runStatic(self, container, Inputs): + """ + This method performs a static analysis of the FT model + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, value, float, value of the Tope Event of the FT + """ + + inputForFT = {} + for key in container.InvMapping.keys(): + inputForFT[key] = Inputs[container.InvMapping[key]] + value = container.faultTreeModel.evaluateFT(inputForFT) + return value + + def runTimeDep(self, container, Inputs): + """ + This method performs a dynamic analysis of the FT model + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, outcome, dict, time depedendnt value of the Tope Event of the FT + """ + times = [] + times.append(0.) + for key in Inputs.keys(): + if key in container.mapping.keys() and Inputs[key]!=1.: + times.append(Inputs[key]) + times = sorted(times, key=float) + + outcome={} + outcome[container.topEventID] = np.asarray([0.]) + + for time in times: + inputToPass=self.inputToBePassed(container,time,Inputs) + tempOut = self.runStatic(container, inputToPass) + for var in outcome.keys(): + if tempOut[var] == 1.: + if time == 0.: + outcome[var] = np.asarray([1.]) + else: + if outcome[var][0] <= 0: + outcome[var] = np.asarray([time]) + return outcome + + def inputToBePassed(self,container,time,Inputs): + """ + This method return the status of the input variables at time t=time + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ In, time, float, time at which the input variables need to be evaluated + @ Out, inputToBePassed, dict, value of the FT basic events at t=time + """ + inputToBePassed = {} + for key in Inputs.keys(): + if key in container.mapping.keys(): + if Inputs[key] == 0. 
or Inputs[key] == 1.: + inputToBePassed[key] = Inputs[key] + else: + if Inputs[key] > time: + inputToBePassed[key] = np.asarray([0.]) + else: + inputToBePassed[key] = np.asarray([1.]) + return inputToBePassed + + + diff --git a/plugins/PRAplugin/src/GraphModel.py b/plugins/PRAplugin/src/GraphModel.py new file mode 100644 index 0000000000..a4c7f5afab --- /dev/null +++ b/plugins/PRAplugin/src/GraphModel.py @@ -0,0 +1,246 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +import numpy as np +import xml.etree.ElementTree as ET +from utils import utils +from utils import graphStructure as GS +import copy +from utils import xmlUtils as xmlU +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from PluginsBaseClasses.ExternalModelPluginBase import ExternalModelPluginBase +#Internal Modules End----------------------------------------------------------- + + +class GraphModel(ExternalModelPluginBase): + """ + This class is designed to create a directed graph model which is employed to model Reliability Block Diagrams + """ + def __init__(self): + """ + Constructor + @ In, None + @ Out, None + """ + ExternalModelPluginBase.__init__(self) + + def _readMoreXML(self, container, xmlNode): + """ + Method to read the portion of the XML that belongs to GraphModel + @ In, container, object, self-like object where all the variables can be stored + @ In, xmlNode, xml.etree.ElementTree.Element, XML node that needs to be read + @ Out, None + """ + container.modelFile = None # file name containing the RBD structure + container.nodesIN = None # ID of the RBD input nodes + container.nodesOUT = None # ID of the RBD output nodes + + container.mapping = {} # Mapping dictionary for input variables + container.InvMapping = {} # Inverse mapping dictionary for input variables + + for child in xmlNode: + if child.tag == 'nodesIN': + container.nodesIN = [str(var.strip()) for var in child.text.split(",")] + elif child.tag == 'nodesOUT': + container.nodesOUT = [str(var.strip()) for var in child.text.split(",")] + elif child.tag == 'modelFile': + container.modelFile = child.text.strip() + '.xml' + elif child.tag == 'map': + container.mapping[child.get('var')] = child.text.strip() + container.InvMapping[child.text.strip()] = child.get('var') + elif child.tag == 'variables': + variables = [str(var.strip()) for var in child.text.split(",")] + else: + print('xml error') + + if container.nodesIN is None: + raise IOError("GraphModel: XML block is not specified") + if container.nodesOUT is None: + raise IOError("GraphModel: XML block is not specified") + if container.modelFile is None: 
+ raise IOError("GraphModel: XML block is not specified") + + if set(variables) != set(container.mapping.keys()): + raise IOError("GraphModel: the set of variables specified in the " + str(set(variables)) + " XML block does not match with the specified mapping" + str(set(container.mapping.keys()))) + if not set(container.nodesOUT) <= set(container.mapping.values()): + raise IOError("GraphModel: the set of out variables specified in the " + str(set(variables)) + " XML block does not match with the specified mapping" + str(set(container.mapping.values()))) + + def initialize(self, container,runInfoDict,inputFiles): + """ + Method to initialize the GraphModel + @ In, container, object, self-like object where all the variables can be stored + @ In, runInfoDict, dict, dictionary containing all the RunInfo parameters (XML node ) + @ In, inputFiles, list, list of input files (if any) + @ Out, None + """ + container.nodes = {} + container.deg = {} + + container.runInfo = runInfoDict + self.createGraph(container,container.modelFile) + + def createGraph(self,container,file): + """ + Method that actually creates from file the graph structure of the model + @ In, container, object, self-like object where all the variables can be stored + @ In, file, file, file containing the structure of the model + @ Out, None + """ + graph = ET.parse(container.runInfo['WorkingDir'] + '/' + file) + graph = xmlU.findAllRecursive(graph,'Graph') + + for node in xmlU.findAllRecursive(graph[0], 'node'): + nodeName = node.get('name') + nodeChilds = [] + deg = None + for child in node: + if child.tag == 'childs': + nodeChilds = [var.strip() for var in child.text.split(",")] + if child.tag == 'deg': + deg = float(child.text) + container.nodes[nodeName] = nodeChilds + container.deg[nodeName] = deg + + def run(self, container, Inputs): + """ + This method computes all possible path from the input to the output nodes + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + + """ + if self.checkTypeOfAnalysis(container,Inputs): + dictOUT = self.runTimeDep(container, Inputs) + else: + dictOUT = self.runStatic(container, Inputs) + + for var in dictOUT.keys(): + container.__dict__[var] = dictOUT[var] + + def checkTypeOfAnalysis(self,container,Inputs): + """ + This method checks which type of analysis to be performed: + - True: dynamic (time dependent) + - False: static + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, analysisType, bool, type of analysis to be performed + + """ + arrayValues=set() + for key in Inputs.keys(): + if key in container.mapping.keys(): + arrayValues.add(Inputs[key][0]) + analysisType = None + if arrayValues.difference({0.,1.}): + analysisType = True + else: + analysisType = False + return analysisType + + def runStatic(self, container, Inputs): + """ + This method performs a static analysis of the graph model + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, dictOut, dict, dictionary containing the status of all output variables + """ + mapping = copy.deepcopy(container.mapping) + nodes = copy.deepcopy(container.nodes) + + for key in Inputs.keys(): + if key in mapping.keys(): + if mapping[key] in nodes.keys() and Inputs[key][0] == 1.0: + nodes.pop(mapping[key],None) + for node in nodes.keys(): + if mapping[key] in nodes[node]: + 
nodes[node].remove(mapping[key]) + + ravenGraph = GS.graphObject(nodes) + + dictOut = {} + for nodeO in container.nodesOUT: + paths = [] + for nodeI in container.nodesIN: + paths = paths + ravenGraph.findAllPaths(nodeI,nodeO) + var = container.InvMapping[nodeO] + if paths: + dictOut[var] = np.asarray(0.) + else: + dictOut[var] = np.asarray(1.) + return dictOut + + def runTimeDep(self, container, Inputs): + """ + This method performs a dynamic analysis of the graph model + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ Out, outcome, dict, dictionary containing the temporal status of all output variables + """ + times = [] + times.append(0.) + for key in Inputs.keys(): + if key in container.mapping.keys() and Inputs[key][0]!=1.: + times.append(Inputs[key][0]) + times = sorted(times, key=float) + + outcome={} + for var in container.nodesOUT: + outcome[container.InvMapping[var]] = np.asarray([0.]) + + for time in times: + inputToPass=self.inputToBePassed(container,time,Inputs) + tempOut = self.runStatic(container, inputToPass) + for var in tempOut.keys(): + if tempOut[var] == 1.: + if time == 0.: + outcome[var] = np.asarray([1.]) + else: + if outcome[var][0] <= 0: + outcome[var] = np.asarray([time]) + return outcome + + def inputToBePassed(self,container,time,Inputs): + """ + This method returns the status of the input variables at time t=time + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + @ In, time, float, time at which the input variables need to be evaluated + @ Out, inputToBePassed, dict, value of the RBD nodes at t=time + """ + inputToBePassed = {} + for key in Inputs.keys(): + if key in container.mapping.keys(): + if Inputs[key][0] == 0. or Inputs[key][0] == 1.: + inputToBePassed[key] = Inputs[key] + else: + if Inputs[key][0] > time: + inputToBePassed[key] = np.asarray([0.]) + else: + inputToBePassed[key] = np.asarray([1.]) + return inputToBePassed + + + diff --git a/plugins/PRAplugin/src/MarkovModel.py b/plugins/PRAplugin/src/MarkovModel.py new file mode 100644 index 0000000000..26edc95fdf --- /dev/null +++ b/plugins/PRAplugin/src/MarkovModel.py @@ -0,0 +1,201 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+""" +Created on April 30, 2018 + +@author: mandd +""" + +from __future__ import division, print_function , unicode_literals, absolute_import +import warnings +warnings.simplefilter('default', DeprecationWarning) + +#External Modules--------------------------------------------------------------- +import numpy as np +import math +import sys +import copy +from operator import mul +#External Modules End----------------------------------------------------------- + +#Internal Modules--------------------------------------------------------------- +from PluginsBaseClasses.ExternalModelPluginBase import ExternalModelPluginBase +from utils.randomUtils import newRNG + +#Internal Modules End----------------------------------------------------------- + + +class MarkovModel(ExternalModelPluginBase): + """ + This class is designed to create a Markov model + """ + def __init__(self): + """ + Constructor + @ In, None + @ Out, None + """ + ExternalModelPluginBase.__init__(self) + self.randomEngine = None # a instance of random number generator + + def _readMoreXML(self, container, xmlNode): + """ + Method to read the portion of the XML that belongs to the Markov Model + @ In, container, object, self-like object where all the variables can be stored + @ In, xmlNode, xml.etree.ElementTree.Element, XML node that needs to be read + @ Out, None + """ + container.initState = None # Markov Model initial state + container.finState = None # Markov Model final state + container.seed = None # Markov Model seed number + container.states = {} # Markov Model dictionary of states + + for child in xmlNode: + if child.tag == 'initState': + container.initState = child.text.strip() + if child.tag == 'finState': + container.finState = child.text.strip() + if child.tag == 'seed': + container.seed = float(child.text.strip()) + elif child.tag == 'endTime': + container.endTime = float(child.text.strip()) + elif child.tag == 'state': + container.states[child.get('name')] = {} + for childChild in child: + if childChild.tag == 'transition': + if childChild.get('type') == 'lambda': + value = float(childChild.get('value')) + elif childChild.get('type') == 'tau': + value = 1. 
/ float(childChild.get('value')) + elif childChild.get('type') == 'instant': + value = [float(childChild.get('value'))] + elif childChild.get('type') == 'unif': + value = [float(var.strip()) for var in childChild.get('value').split(",")] + else: + raise IOError("MarkovModel: transition " + str (childChild.get('type')) + " is not allowed") + container.states[child.get('name')][childChild.text.strip()] = value + else: + raise IOError("MarkovModel: xml node " + str (childChild.tag) + " is not allowed") + + statesIDs = container.states.keys() + for state in container.states: + transitions = container.states[state].keys() + if not set(transitions).issubset(set(statesIDs)): + raise IOError("MarkovModel: the set of transtions " + str (set(transitions)) + " out of state " + str(state) + " lead to not defined states") + if container.initState is None: + raise IOError("MarkovModel: XML block is not specified") + if container.finState is None: + raise IOError("MarkovModel: XML block is not specified") + + def initialize(self, container,runInfoDict,inputFiles): + """ + Method to initialize this Markov model + @ In, container, object, self-like object where all the variables can be stored + @ In, runInfoDict, dict, dictionary containing all the RunInfo parameters (XML node ) + @ In, inputFiles, list, list of input files (if any) + @ Out, None + """ + self.randomEngine = newRNG(env='numpy') + if container.seed is not None: + self.randomEngine.seed(container.seed) + else: + self.randomEngine.seed(250678) + + def run(self, container, Inputs): + """ + This method computes all the final state at the end of the specified time + @ In, container, object, self-like object where all the variables can be stored + @ In, Inputs, dict, dictionary of inputs from RAVEN + + """ + time = 0. 
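    # Random walk over the chain: starting from the initial state provided by RAVEN,
    # repeatedly draw the next transition (elapsed time and arrival state) and accumulate
    # the elapsed time; the state occupied when the mission time endTime is exceeded is
    # reported as the final state.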
+ actualState = str(int(Inputs[container.initState][0])) + while True: + transitionDict = copy.deepcopy(container.states[actualState]) + transitionTime , newState = self.newState(transitionDict) + time += transitionTime + if time >= container.endTime: + break + else: + actualState = newState + + container.__dict__[container.finState] = np.asarray(actualState) + + def newState(self,dictIn): + """ + Method which calculates the next transition out of a state + @ In, dictIn, dict, dictionary containing all possible transitions out of a state + @ Out, detTransitionTime, float, time of the next transition + @ Out, detState, float, arrival state for the next transition + """ + detTrans = {} + stochTrans = {} + detTransitionTime = sys.float_info.max + stochTransitionTime = sys.float_info.max + + for key in dictIn.keys(): + if type(dictIn[key]) is list: + detTrans[key] = copy.deepcopy(dictIn[key]) + else: + stochTrans[key] = copy.deepcopy(dictIn[key]) + if detTrans: + detTransitionTime, detState = self.detNewState(detTrans) + if stochTrans: + stochTransitionTime, stochState = self.stochNewState(stochTrans) + if stochTransitionTime < detTransitionTime: + return stochTransitionTime, stochState + else: + return detTransitionTime, detState + + def detNewState(self,detTrans): + """ + Method which calculates the next transition out of a state for a deterministic transition + @ In, dictIn, dict, dictionary containing all possible transitions out of a state + @ Out, detTransitionTime, float, time of the next transition + @ Out, detTransitionState, float, arrival state for the next transition + """ + detTransitionTime = sys.float_info.max + detTransitionState = None + for key in detTrans.keys(): + if len(detTrans[key]) == 1: + time = detTrans[key][0] + if time + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ETmodelTD/eventTree.xml b/plugins/PRAplugin/tests/ETmodelTD/eventTree.xml new file mode 100644 index 0000000000..b9c5c08e63 --- /dev/null +++ b/plugins/PRAplugin/tests/ETmodelTD/eventTree.xml @@ -0,0 +1,33 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/FTmodel/FT1.xml b/plugins/PRAplugin/tests/FTmodel/FT1.xml new file mode 100644 index 0000000000..9b91abd5fd --- /dev/null +++ b/plugins/PRAplugin/tests/FTmodel/FT1.xml @@ -0,0 +1,58 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/FTmodelTD/FT1.xml b/plugins/PRAplugin/tests/FTmodelTD/FT1.xml new file mode 100644 index 0000000000..27e33f3488 --- /dev/null +++ b/plugins/PRAplugin/tests/FTmodelTD/FT1.xml @@ -0,0 +1,22 @@ + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/dataClassifier/THmodel.py b/plugins/PRAplugin/tests/dataClassifier/THmodel.py new file mode 100644 index 0000000000..9626190d81 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifier/THmodel.py @@ -0,0 +1,51 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import numpy as np + +def run(self,Input): + """ + This method computes the final status of this simplified LB LOCA event + @ In, Input, dict, dictionary of inputs from RAVEN + @ Out, None + """ + self.ACCstatus = Input['ACC_status'][0] + self.timeLPI = Input['time_LPI'][0] + self.timeLPR = Input['time_LPR'][0] + + timeToCDLPI = 6. + timeToCDLPR = 11. + + if self.ACCstatus == 1.: + self.out = 1. + self.LPI_status = 1. + self.LPR_status = 1. + else: + self.LPIact = self.timeLPI + 1. + if self.LPIact > timeToCDLPI: + self.out = 1. + self.LPI_status = 1. + self.LPR_status = 1. + else: + self.LPI_status = 0. + self.LPRact = self.LPIact + self.timeLPR + if self.LPRact > timeToCDLPR: + self.out = 1. + self.LPR_status = 1. + else: + self.out = 0. + self.LPR_status = 0. + + + diff --git a/plugins/PRAplugin/tests/dataClassifier/eventTree.xml b/plugins/PRAplugin/tests/dataClassifier/eventTree.xml new file mode 100644 index 0000000000..b9c5c08e63 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifier/eventTree.xml @@ -0,0 +1,33 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/dataClassifier/func_ACC.py b/plugins/PRAplugin/tests/dataClassifier/func_ACC.py new file mode 100644 index 0000000000..66410757b1 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifier/func_ACC.py @@ -0,0 +1,21 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +def evaluate(self): + """ + This method returns the variable ACC_status + @ In, None + @ Out, ACC_status, float, accumulator system status + """ + return self.ACC_status diff --git a/plugins/PRAplugin/tests/dataClassifier/func_LPI.py b/plugins/PRAplugin/tests/dataClassifier/func_LPI.py new file mode 100644 index 0000000000..8ebeff0a90 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifier/func_LPI.py @@ -0,0 +1,21 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +def evaluate(self): + """ + This method returns the variable LPI_status + @ In, None + @ Out, LPI_status, float, LPI system status + """ + return self.LPI_status diff --git a/plugins/PRAplugin/tests/dataClassifier/func_LPR.py b/plugins/PRAplugin/tests/dataClassifier/func_LPR.py new file mode 100644 index 0000000000..c5ad4e6a27 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifier/func_LPR.py @@ -0,0 +1,21 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +def evaluate(self): + """ + This method returns the variable LPR_status + @ In, None + @ Out, LPR_status, float, LPR system status + """ + return self.LPR_status diff --git a/plugins/PRAplugin/tests/dataClassifierHS/THmodelTD.py b/plugins/PRAplugin/tests/dataClassifierHS/THmodelTD.py new file mode 100644 index 0000000000..fc8594a4b9 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifierHS/THmodelTD.py @@ -0,0 +1,64 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import numpy as np + +def run(self,Input): + """ + This method computes the final status of this simplified LB LOCA event in time dependent mode + @ In, Input, dict, dictionary of inputs from RAVEN + @ Out, None + """ + self.ACC = Input['ACC_sim'] + self.time_LPI = Input['time_LPI'] + self.time_LPR = Input['time_LPR'] + + timeToCD_LPI = 6. + timeToCD_LPR = 11. + + if self.ACC == 1.: + self.time = np.array([ 0, 1, 2]) + self.temp = np.array([10,15,20]) + self.out = np.array([0.,0.,1.]) + self.ACC_status = np.array([0.,1.,1.]) + self.LPI_status = np.array([0.,0.,0.]) + self.LPR_status = np.array([0.,0.,0.]) + else: + self.LPI_act = self.time_LPI + 1. 
+ if self.LPI_act > timeToCD_LPI: + self.time = np.array([ 0, 1, 2, 3, 4, 5]) + self.temp = np.array([10,10,10,10,15,20]) + self.out = np.array([0.,0.,0.,0.,0.,1.]) + self.ACC_status = np.array([0.,0.,0.,0.,0.,0.]) + self.LPI_status = np.array([0.,0.,0.,0.,0.,1.]) + self.LPR_status = np.array([0.,0.,0.,0.,0.,0.]) + else: + self.LPR_act = self.LPI_act + self.time_LPR + if self.LPR_act > timeToCD_LPR: + self.time = np.array([ 0, 1, 2, 3, 4, 5, 6, 7, 8]) + self.temp = np.array([10,10,10,10,10,10,10,15,20]) + self.out = np.array([0.,0.,0.,0.,0.,0.,0.,0.,1.]) + self.ACC_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.]) + self.LPI_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.]) + self.LPR_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,1.]) + else: + self.time = np.array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,10]) + self.temp = np.array([10,10,10,10,10,10,10,10,10,10,10]) + self.out = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.,0.,0.]) + self.ACC_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.,0.,0.]) + self.LPI_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.,0.,0.]) + self.LPR_status = np.array([0.,0.,0.,0.,0.,0.,0.,0.,0.,0.,0.]) + + + diff --git a/plugins/PRAplugin/tests/dataClassifierHS/eventTree.xml b/plugins/PRAplugin/tests/dataClassifierHS/eventTree.xml new file mode 100644 index 0000000000..b9c5c08e63 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifierHS/eventTree.xml @@ -0,0 +1,33 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/dataClassifierHS/func_ACC.py b/plugins/PRAplugin/tests/dataClassifierHS/func_ACC.py new file mode 100644 index 0000000000..83477ef272 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifierHS/func_ACC.py @@ -0,0 +1,23 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import numpy as np + +def evaluate(self): + """ + This method returns the variable ACC_status + @ In, None + @ Out, ACC_status, np.array, accumulator system status + """ + return np.amax(self.ACC_status) diff --git a/plugins/PRAplugin/tests/dataClassifierHS/func_LPI.py b/plugins/PRAplugin/tests/dataClassifierHS/func_LPI.py new file mode 100644 index 0000000000..2c0d38e3e7 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifierHS/func_LPI.py @@ -0,0 +1,23 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +import numpy as np + +def evaluate(self): + """ + This method returns the variable LPI_status + @ In, None + @ Out, LPI_status, np.array, LPI system status + """ + return np.amax(self.LPI_status) diff --git a/plugins/PRAplugin/tests/dataClassifierHS/func_LPR.py b/plugins/PRAplugin/tests/dataClassifierHS/func_LPR.py new file mode 100644 index 0000000000..c863ea12d3 --- /dev/null +++ b/plugins/PRAplugin/tests/dataClassifierHS/func_LPR.py @@ -0,0 +1,23 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import numpy as np + +def evaluate(self): + """ + This method returns the variable LPR_status + @ In, None + @ Out, LPR_status, np.array, LPR system status + """ + return np.amax(self.LPR_status) diff --git a/plugins/PRAplugin/tests/ensembleDiscrete/FT1.xml b/plugins/PRAplugin/tests/ensembleDiscrete/FT1.xml new file mode 100644 index 0000000000..81908ac039 --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleDiscrete/FT1.xml @@ -0,0 +1,17 @@ + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ensembleDiscrete/FT2.xml b/plugins/PRAplugin/tests/ensembleDiscrete/FT2.xml new file mode 100644 index 0000000000..8c0845e13b --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleDiscrete/FT2.xml @@ -0,0 +1,17 @@ + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ensembleDiscrete/RBD.xml b/plugins/PRAplugin/tests/ensembleDiscrete/RBD.xml new file mode 100644 index 0000000000..e2024c469d --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleDiscrete/RBD.xml @@ -0,0 +1,16 @@ + + + alpha + + + beta,gamma + + + ACC + + + ACC + + + + diff --git a/plugins/PRAplugin/tests/ensembleDiscrete/eventTree.xml b/plugins/PRAplugin/tests/ensembleDiscrete/eventTree.xml new file mode 100644 index 0000000000..7821d62baf --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleDiscrete/eventTree.xml @@ -0,0 +1,42 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ensembleMixed/FT1.xml b/plugins/PRAplugin/tests/ensembleMixed/FT1.xml new file mode 100644 index 0000000000..81908ac039 --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleMixed/FT1.xml @@ -0,0 +1,17 @@ + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ensembleMixed/FT2.xml b/plugins/PRAplugin/tests/ensembleMixed/FT2.xml new file mode 100644 index 0000000000..8c0845e13b --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleMixed/FT2.xml @@ -0,0 +1,17 @@ + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/ensembleMixed/RBD.xml b/plugins/PRAplugin/tests/ensembleMixed/RBD.xml new file mode 100644 index 0000000000..e2024c469d --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleMixed/RBD.xml @@ -0,0 +1,16 @@ + + + alpha + + + beta,gamma + + + ACC + + + ACC + + + + diff --git a/plugins/PRAplugin/tests/ensembleMixed/eventTree.xml 
b/plugins/PRAplugin/tests/ensembleMixed/eventTree.xml new file mode 100644 index 0000000000..7821d62baf --- /dev/null +++ b/plugins/PRAplugin/tests/ensembleMixed/eventTree.xml @@ -0,0 +1,42 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/plugins/PRAplugin/tests/gold/ETmodel/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/ETmodel/Print_sim_PS.csv new file mode 100644 index 0000000000..2c063f3c70 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/ETmodel/Print_sim_PS.csv @@ -0,0 +1,11 @@ +statusACC,statusLPI,statusLPR,sequence +1.0,1.0,0.0,3.0 +1.0,1.0,1.0,3.0 +1.0,0.0,1.0,3.0 +0.0,1.0,0.0,2.0 +1.0,1.0,1.0,3.0 +1.0,1.0,0.0,3.0 +1.0,0.0,1.0,3.0 +0.0,1.0,1.0,2.0 +1.0,1.0,1.0,3.0 +0.0,1.0,1.0,2.0 diff --git a/plugins/PRAplugin/tests/gold/ETmodelTD/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/ETmodelTD/Print_sim_PS.csv new file mode 100644 index 0000000000..1c755f04fb --- /dev/null +++ b/plugins/PRAplugin/tests/gold/ETmodelTD/Print_sim_PS.csv @@ -0,0 +1,11 @@ +statusACC,statusLPI,statusLPR,sequence +1.0,1.0,2.24664963136,3.0 +1.0,1.0,8.92634152643,3.0 +1.0,0.0,5.21344427374,3.0 +0.0,1.0,0.0979241356482,2.0 +1.0,1.0,5.15094192353,3.0 +1.0,1.0,4.25363342144,3.0 +1.0,0.0,7.06316464512,3.0 +0.0,1.0,9.3402300564,2.0 +1.0,1.0,5.73162940231,3.0 +0.0,1.0,8.40653911196,2.0 diff --git a/plugins/PRAplugin/tests/gold/FTmodel/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/FTmodel/Print_sim_PS.csv new file mode 100644 index 0000000000..2ea1a4aeb7 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/FTmodel/Print_sim_PS.csv @@ -0,0 +1,11 @@ +statusBE1,statusBE2,statusBE3,statusBE4,TOP +1.0,0.0,1.0,1.0,1 +1.0,1.0,1.0,1.0,1 +1.0,0.0,0.0,0.0,0 +1.0,1.0,1.0,1.0,1 +1.0,1.0,1.0,0.0,1 +1.0,0.0,1.0,0.0,1 +0.0,1.0,1.0,1.0,0 +0.0,1.0,1.0,1.0,0 +1.0,0.0,0.0,0.0,0 +1.0,1.0,1.0,1.0,1 diff --git a/plugins/PRAplugin/tests/gold/FTmodelTD/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/FTmodelTD/Print_sim_PS.csv new file mode 100644 index 0000000000..545fd54e03 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/FTmodelTD/Print_sim_PS.csv @@ -0,0 +1,51 @@ +statusA,statusB,statusC,statusD,TOP +2.24664963136,1.0,5.62087161132,1.0,1.0 +9.57246520081,1.0,6.41341173472,1.0,1.0 +0.00218694564006,1.0,0.0979241356482,0.0,0.00218694564006 +5.15094192353,1.0,6.96538095757,1.0,1.0 +5.32285482979,1.0,7.08862688325,0.0,5.32285482979 +1.88821980774,1.0,9.3402300564,0.0,1.88821980774 +5.73162940231,0.0,9.19546000408,1.0,9.19546000408 +5.18732519243,0.0,5.2543772583,1.0,5.2543772583 +1.41330159535,1.0,4.43653373384,0.0,1.41330159535 +5.36693474636,1.0,5.83254746297,1.0,1.0 +4.29180151883,1.0,9.11441485144,0.0,4.29180151883 +6.75597047358,0.0,5.51381058654,0.0,6.75597047358 +5.83715440143,1.0,3.45076741731,0.0,5.83715440143 +3.33802351806,0.0,9.72692777396,1.0,9.72692777396 +1.63997425037,1.0,4.59666654342,0.0,1.63997425037 +3.10175746519,1.0,0.832170900151,0.0,3.10175746519 +8.93882137931,0.0,7.31960676082,1.0,7.31960676082 +5.46719618036,0.0,7.1914006507,1.0,7.1914006507 +6.94748646276,1.0,8.30109125429,0.0,6.94748646276 +8.55504666887,0.0,9.80455060485,0.0,9.80455060485 +4.83152987082,1.0,9.77330468357,1.0,1.0 +8.657708808,0.0,0.912213421173,1.0,0.912213421173 +4.99716694816,0.0,5.65280628755,1.0,5.65280628755 +4.64883334344,0.0,1.24505858199,0.0,4.64883334344 +6.22172621456,1.0,6.62256540885,1.0,1.0 +0.164408583698,0.0,2.10937010173,0.0,2.10937010173 +8.56518048527,1.0,5.29768012587,1.0,1.0 +4.34287728377,0.0,1.25798387249,1.0,1.25798387249 
+3.80918538054,0.0,4.40265037222,0.0,4.40265037222 +5.78546227556,0.0,1.6972877159,1.0,1.6972877159 +3.55900690275,1.0,6.13880779039,1.0,1.0 +8.75345056382,1.0,8.61127031236,0.0,8.75345056382 +4.63730472015,0.0,0.0551573373506,1.0,0.0551573373506 +6.21759798523,0.0,3.68494227847,0.0,6.21759798523 +6.39856767524,0.0,9.35379168236,0.0,9.35379168236 +7.62101013158,0.0,5.29917485437,0.0,7.62101013158 +1.46395325229,0.0,3.86198339142,0.0,3.86198339142 +3.95084401452,0.0,4.3662357876,0.0,4.3662357876 +9.24778517085,0.0,3.86207630715,1.0,3.86207630715 +0.0702077453188,1.0,2.49289399769,1.0,1.0 +3.97659762855,1.0,5.6533266035,1.0,1.0 +8.44837347242,1.0,8.31925248222,0.0,8.44837347242 +0.289413121131,0.0,4.79656091072,0.0,4.79656091072 +8.03152500606,0.0,6.01710352954,1.0,6.01710352954 +3.1051760244,0.0,4.43729281296,0.0,4.43729281296 +8.53342957062,1.0,3.87900359786,1.0,1.0 +6.23904481908,0.0,0.240672470592,0.0,6.23904481908 +7.27012900572,0.0,7.54094355217,1.0,7.54094355217 +1.06484041341,1.0,8.8106053669,1.0,1.0 +4.59364371016,0.0,6.89237321422,0.0,6.89237321422 diff --git a/plugins/PRAplugin/tests/gold/dataClassifier/Print_ET_PS.csv b/plugins/PRAplugin/tests/gold/dataClassifier/Print_ET_PS.csv new file mode 100644 index 0000000000..df727669f7 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifier/Print_ET_PS.csv @@ -0,0 +1,9 @@ +ACC,LPR,LPI,sequence +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0 +0.0,0.0,1.0,2.0 +1.0,0.0,0.0,3.0 +1.0,0.0,1.0,3.0 +0.0,1.0,1.0,2.0 +1.0,1.0,0.0,3.0 +1.0,1.0,1.0,3.0 diff --git a/plugins/PRAplugin/tests/gold/dataClassifier/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/dataClassifier/Print_sim_PS.csv new file mode 100644 index 0000000000..b6352fde07 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifier/Print_sim_PS.csv @@ -0,0 +1,11 @@ +time_LPR,ACC_status,time_LPI,LPR_status,sequence,LPI_status,out +5.34359778869,0.0,4.15626305329,0.0,0.0,0.0,0.0 +6.05535201968,1.0,5.71959019433,1.0,3.0,1.0,1.0 +5.36204600364,0.0,3.78383580565,0.0,0.0,0.0,0.0 +1.48356827064,0.0,4.16739277245,0.0,0.0,0.0,0.0 +5.66349480335,0.0,4.51446945804,1.0,1.0,0.0,1.0 +5.74180093989,0.0,4.08101624411,0.0,0.0,0.0,0.0 +5.55006524542,1.0,3.90389579323,1.0,3.0,1.0,1.0 +4.11775431501,1.0,5.69722062124,1.0,3.0,1.0,1.0 +6.93058679241,0.0,5.40202427067,1.0,2.0,1.0,1.0 +4.28168888876,1.0,4.0469727304,1.0,3.0,1.0,1.0 diff --git a/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_0.csv b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_0.csv new file mode 100644 index 0000000000..bc1e7c841c --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_0.csv @@ -0,0 +1,4 @@ +time,temp,out,ACC_status,LPI_status,LPR_status,sequence +0,10.0,0.0,0.0,0.0,0.0,3.0 +1,15.0,0.0,1.0,0.0,0.0,3.0 +2,20.0,1.0,1.0,0.0,0.0,3.0 diff --git a/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_3.csv b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_3.csv new file mode 100644 index 0000000000..bcd32e363e --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_3.csv @@ -0,0 +1,12 @@ +time,temp,out,ACC_status,LPI_status,LPR_status,sequence +0,10.0,0.0,0.0,0.0,0.0,0.0 +1,10.0,0.0,0.0,0.0,0.0,0.0 +2,10.0,0.0,0.0,0.0,0.0,0.0 +3,10.0,0.0,0.0,0.0,0.0,0.0 +4,10.0,0.0,0.0,0.0,0.0,0.0 +5,10.0,0.0,0.0,0.0,0.0,0.0 +6,10.0,0.0,0.0,0.0,0.0,0.0 +7,10.0,0.0,0.0,0.0,0.0,0.0 +8,10.0,0.0,0.0,0.0,0.0,0.0 +9,10.0,0.0,0.0,0.0,0.0,0.0 +10,10.0,0.0,0.0,0.0,0.0,0.0 diff --git a/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_7.csv 
b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_7.csv new file mode 100644 index 0000000000..11c15965d3 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_7.csv @@ -0,0 +1,7 @@ +time,temp,out,ACC_status,LPI_status,LPR_status,sequence +0,10.0,0.0,0.0,0.0,0.0,2.0 +1,10.0,0.0,0.0,0.0,0.0,2.0 +2,10.0,0.0,0.0,0.0,0.0,2.0 +3,10.0,0.0,0.0,0.0,0.0,2.0 +4,15.0,0.0,0.0,0.0,0.0,2.0 +5,20.0,1.0,0.0,1.0,0.0,2.0 diff --git a/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_9.csv b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_9.csv new file mode 100644 index 0000000000..3cb8ccf4f2 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/dataClassifierHS/Print_sim_PS_9.csv @@ -0,0 +1,10 @@ +time,temp,out,ACC_status,LPI_status,LPR_status,sequence +0,10.0,0.0,0.0,0.0,0.0,1.0 +1,10.0,0.0,0.0,0.0,0.0,1.0 +2,10.0,0.0,0.0,0.0,0.0,1.0 +3,10.0,0.0,0.0,0.0,0.0,1.0 +4,10.0,0.0,0.0,0.0,0.0,1.0 +5,10.0,0.0,0.0,0.0,0.0,1.0 +6,10.0,0.0,0.0,0.0,0.0,1.0 +7,15.0,0.0,0.0,0.0,0.0,1.0 +8,20.0,1.0,0.0,0.0,1.0,1.0 diff --git a/plugins/PRAplugin/tests/gold/ensembleDiscrete/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/ensembleDiscrete/Print_sim_PS.csv new file mode 100644 index 0000000000..39740a431d --- /dev/null +++ b/plugins/PRAplugin/tests/gold/ensembleDiscrete/Print_sim_PS.csv @@ -0,0 +1,100 @@ +A,B,C,beta,gamma,initialState,initEvent,sequence +1.0,0.0,0.0,1.0,1.0,0.0,0.0,4.0 +1.0,0.0,0.0,1.0,0.0,1.0,0.0,3.0 +0.0,1.0,1.0,1.0,0.0,1.0,1.0,5.0 +1.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,0.0,1.0,1.0,0.0,1.0,0.0,4.0 +1.0,1.0,0.0,0.0,0.0,1.0,1.0,5.0 +0.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,1.0,1.0,5.0 +1.0,0.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,0.0,1.0,0.0,1.0,1.0,1.0,5.0 +1.0,0.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,1.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,0.0,0.0,1.0,0.0,0.0,0.0,3.0 +0.0,0.0,1.0,1.0,1.0,1.0,0.0,4.0 +1.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +1.0,0.0,0.0,1.0,0.0,0.0,1.0,5.0 +0.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,0.0,1.0,0.0,1.0,5.0 +0.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,1.0,1.0,0.0,4.0 +0.0,1.0,1.0,1.0,0.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,1.0,0.0,1.0,1.0,0.0,0.0,4.0 +1.0,0.0,0.0,0.0,1.0,1.0,1.0,5.0 +1.0,0.0,0.0,1.0,0.0,1.0,0.0,3.0 +0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.0 +1.0,1.0,0.0,0.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,1.0,1.0,0.0,1.0,5.0 +0.0,1.0,0.0,0.0,0.0,0.0,1.0,5.0 +1.0,1.0,0.0,1.0,0.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,0.0,0.0,1.0,1.0,1.0,0.0,4.0 +1.0,0.0,0.0,0.0,1.0,1.0,1.0,5.0 +1.0,0.0,0.0,1.0,1.0,1.0,0.0,4.0 +1.0,1.0,0.0,0.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,1.0,0.0,0.0,1.0,5.0 +0.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,1.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +0.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 +1.0,0.0,0.0,1.0,1.0,1.0,0.0,4.0 +1.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 
+1.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,1.0,0.0,3.0 +0.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,1.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,0.0,0.0,1.0,0.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,1.0,1.0,1.0,0.0,1.0,0.0,3.0 +1.0,0.0,0.0,1.0,1.0,1.0,1.0,5.0 +1.0,0.0,0.0,1.0,1.0,0.0,1.0,5.0 +0.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,1.0,0.0,1.0,0.0,0.0,0.0,3.0 +0.0,1.0,0.0,1.0,0.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,0.0,1.0,0.0,3.0 +1.0,1.0,0.0,1.0,1.0,1.0,0.0,4.0 +0.0,1.0,0.0,0.0,1.0,1.0,1.0,5.0 +0.0,0.0,0.0,1.0,0.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,0.0,1.0,1.0,5.0 +1.0,1.0,0.0,1.0,1.0,1.0,0.0,4.0 +0.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,1.0,0.0,1.0,1.0,1.0,0.0,4.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 +0.0,0.0,0.0,0.0,1.0,1.0,0.0,3.0 +1.0,1.0,0.0,0.0,1.0,0.0,0.0,3.0 +1.0,1.0,0.0,1.0,1.0,0.0,0.0,4.0 +0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.0 +0.0,1.0,0.0,0.0,0.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,0.0,0.0,1.0,1.0,1.0,0.0,4.0 +0.0,1.0,1.0,0.0,0.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +0.0,0.0,1.0,0.0,1.0,0.0,0.0,3.0 +1.0,0.0,0.0,0.0,0.0,1.0,0.0,3.0 +1.0,0.0,0.0,0.0,1.0,0.0,0.0,3.0 diff --git a/plugins/PRAplugin/tests/gold/ensembleMixed/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/ensembleMixed/Print_sim_PS.csv new file mode 100644 index 0000000000..c4393d8432 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/ensembleMixed/Print_sim_PS.csv @@ -0,0 +1,100 @@ +A,B,C,beta,gamma,initialState,initEvent,sequence,statusLPI,statusACC,LPR +1.0,0.00218694564006,4.14429881241,5.66469494152,1.0,0.0,0.0,4.0,0,4.14429881241,0.00218694564006 +1.0,4.25363342144,7.70896024483,7.08862688325,0.0,1.0,0.0,4.0,1,7.70896024483,4.25363342144 +0.0,9.55172521052,9.3402300564,5.73162940231,0.0,1.0,1.0,5.0,1,0.0,0.0 +1.0,5.2543772583,5.18732519243,4.18491912637,0.0,0.0,0.0,4.0,0,5.18732519243,5.2543772583 +1.0,5.36693474636,6.7550185548,7.22607659344,0.0,1.0,0.0,4.0,1,6.7550185548,5.36693474636 +1.0,3.01352649532,9.43836184671,5.51381058654,0.0,1.0,0.0,4.0,1,9.43836184671,3.01352649532 +1.0,7.18080360377,3.45076741731,3.33802351806,0.0,1.0,1.0,5.0,0,3.45076741731,7.18080360377 +0.0,4.59666654342,1.63997425037,3.49614854983,0.0,1.0,0.0,3.0,1,0.0,0.0 +1.0,8.93882137931,7.62914438677,3.11235579967,1.0,1.0,1.0,5.0,1,3.11235579967,8.93882137931 +1.0,2.21913621114,0.169086421414,8.30109125429,0.0,1.0,0.0,4.0,0,0.169086421414,2.21913621114 +1.0,4.07595465986,9.80455060485,4.83152987082,1.0,1.0,1.0,5.0,0,4.83152987082,4.07595465986 +1.0,0.912213421173,8.657708808,5.58289473774,1.0,0.0,0.0,4.0,1,5.58289473774,0.912213421173 +0.0,4.64883334344,1.81724623819,1.46987809135,1.0,0.0,0.0,4.0,1,1.46987809135,0.0 +1.0,1.43153963411,5.54073200923,2.10937010173,1.0,0.0,0.0,4.0,0,2.10937010173,1.43153963411 +1.0,5.45302780472,5.29768012587,4.34287728377,0.0,1.0,0.0,4.0,0,5.29768012587,5.45302780472 +0.0,4.40265037222,3.80918538054,5.87710535291,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,3.55900690275,9.9808973493,9.04965425819,1.0,1.0,0.0,4.0,1,9.04965425819,0.0 +1.0,9.6253840671,6.80387449144,0.0551573373506,0.0,0.0,0.0,4.0,1,6.80387449144,9.6253840671 +1.0,0.354378188577,3.68494227847,6.39856767524,0.0,0.0,1.0,5.0,1,3.68494227847,0.354378188577 +0.0,5.29917485437,7.62101013158,1.44935434485,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,3.95084401452,3.6894097863,2.20089034927,1.0,0.0,0.0,4.0,0,2.20089034927,0.0 +0.0,8.49307628313,2.21242859313,2.49289399769,1.0,0.0,0.0,4.0,1,2.49289399769,0.0 +0.0,8.43483956029,5.6533266035,8.44837347242,1.0,0.0,0.0,4.0,1,8.44837347242,0.0 
+0.0,4.79656091072,0.289413121131,7.70198905089,1.0,0.0,0.0,4.0,1,7.70198905089,0.0 +0.0,3.1051760244,0.455278316619,3.49934295134,1.0,0.0,1.0,5.0,0,3.49934295134,0.0 +0.0,4.18249898455,6.3636691464,0.240672470592,1.0,1.0,0.0,4.0,1,0.240672470592,0.0 +1.0,4.87968352504,7.54094355217,1.06484041341,1.0,1.0,0.0,4.0,1,1.06484041341,4.87968352504 +0.0,6.89237321422,4.59364371016,0.498994786874,1.0,0.0,0.0,4.0,1,0.498994786874,0.0 +0.0,6.44128401215,8.33101266956,6.73068176413,0.0,1.0,0.0,3.0,1,0.0,0.0 +1.0,6.55315397693,7.61717799064,3.06134594908,1.0,0.0,0.0,4.0,1,3.06134594908,6.55315397693 +1.0,6.89799122673,6.75239708897,5.11987986861,1.0,1.0,0.0,4.0,0,5.11987986861,6.89799122673 +0.0,8.45839295035,9.03994753469,6.01018668292,0.0,0.0,0.0,3.0,1,0.0,0.0 +1.0,1.12727695404,1.17933260537,3.37643175697,1.0,0.0,0.0,4.0,1,1.17933260537,1.12727695404 +1.0,2.85006864528,3.95165465166,3.7290574712,1.0,0.0,0.0,4.0,1,3.7290574712,2.85006864528 +0.0,7.42456809558,2.45735396921,1.47698372171,1.0,1.0,0.0,4.0,1,1.47698372171,0.0 +0.0,6.55568846188,6.04068904092,5.01132703969,1.0,0.0,0.0,4.0,1,5.01132703969,0.0 +1.0,4.59885790353,0.133609971994,3.26025241829,1.0,1.0,1.0,5.0,0,0.133609971994,4.59885790353 +1.0,0.677404147731,3.71787967247,8.01412862446,0.0,1.0,0.0,4.0,1,3.71787967247,0.677404147731 +0.0,1.86849705918,1.77479163319,1.55260221137,0.0,0.0,0.0,3.0,1,0.0,0.0 +1.0,9.38303023097,4.96203430112,2.81922152564,0.0,1.0,0.0,4.0,1,4.96203430112,9.38303023097 +1.0,7.94178552179,4.70368516042,5.88011842358,1.0,0.0,1.0,5.0,1,4.70368516042,7.94178552179 +0.0,9.08314684385,6.941086272,0.703151985235,0.0,0.0,1.0,5.0,0,0.0,0.0 +1.0,7.99031138373,1.90521047029,8.68762779252,0.0,0.0,0.0,4.0,1,1.90521047029,7.99031138373 +0.0,7.28263983673,6.85495635421,1.10833024632,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,0.636644354238,1.97245596488,0.635410915277,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,4.50673643372,8.63804379214,6.95537766371,1.0,1.0,0.0,4.0,1,6.95537766371,0.0 +1.0,4.99694360304,2.49661408423,3.47006510558,1.0,1.0,1.0,5.0,1,2.49661408423,4.99694360304 +1.0,1.17612074389,6.74841013661,6.35346718979,1.0,1.0,0.0,4.0,1,6.35346718979,1.17612074389 +1.0,9.80806472707,8.27573800419,1.4423927086,0.0,1.0,0.0,4.0,1,8.27573800419,9.80806472707 +1.0,6.77450179979,6.99028219492,8.37503089765,1.0,0.0,0.0,4.0,1,6.99028219492,6.77450179979 +0.0,4.4342601752,1.77654420766,9.5699858129,0.0,0.0,1.0,5.0,1,0.0,0.0 +0.0,1.22238068637,3.78161607631,1.9146918766,1.0,1.0,0.0,4.0,1,1.9146918766,0.0 +0.0,3.55015245116,4.76743555506,0.0280287349662,0.0,0.0,0.0,3.0,1,0.0,0.0 +1.0,5.93001340421,6.83045362747,9.10858865108,0.0,1.0,0.0,4.0,0,6.83045362747,5.93001340421 +1.0,5.46360976656,6.67188400092,3.59270292418,1.0,1.0,0.0,4.0,1,3.59270292418,5.46360976656 +0.0,3.2765606682,7.98403746867,4.47451354109,1.0,1.0,0.0,4.0,1,4.47451354109,0.0 +0.0,9.73444854834,8.17785089327,7.97665569186,1.0,0.0,0.0,4.0,1,7.97665569186,0.0 +0.0,1.86936244179,2.58831020738,0.693176652001,1.0,1.0,0.0,4.0,1,0.693176652001,0.0 +1.0,2.82943655803,5.80511468831,3.51478817955,1.0,0.0,0.0,4.0,1,3.51478817955,2.82943655803 +1.0,9.39081176636,1.57252191603,7.24566542479,0.0,1.0,0.0,4.0,1,1.57252191603,9.39081176636 +0.0,7.50460027193,1.956251143,1.39900458078,1.0,0.0,0.0,4.0,1,1.39900458078,0.0 +0.0,9.35415084459,8.12100484691,3.25060153456,1.0,1.0,0.0,4.0,0,3.25060153456,0.0 +1.0,4.1001949888,7.56814003167,6.16836350322,1.0,1.0,0.0,4.0,1,6.16836350322,4.1001949888 +1.0,5.93417948716,2.00107234344,0.269116456683,1.0,1.0,0.0,4.0,1,0.269116456683,5.93417948716 
+1.0,9.65623473741,7.34729092274,4.96202675741,1.0,1.0,0.0,4.0,1,4.96202675741,9.65623473741 +1.0,7.10075169734,2.162456336,5.06270544489,0.0,1.0,0.0,4.0,1,2.162456336,7.10075169734 +1.0,5.94902319041,0.39049032619,3.19584609084,1.0,1.0,0.0,4.0,0,0.39049032619,5.94902319041 +0.0,1.33747477116,6.86680188795,3.16664250408,0.0,1.0,0.0,3.0,1,0.0,0.0 +0.0,0.808433594836,5.18443324258,2.46476076601,0.0,1.0,0.0,3.0,1,0.0,0.0 +0.0,5.82665436338,5.10184910267,0.343957133671,0.0,1.0,0.0,3.0,1,0.0,0.0 +0.0,2.75156688941,2.09412899848,7.70149086549,0.0,0.0,0.0,3.0,0,0.0,0.0 +1.0,2.68087735462,5.53929805885,4.34173863715,0.0,0.0,0.0,4.0,1,5.53929805885,2.68087735462 +0.0,6.73550991498,9.05351445988,6.71041515812,0.0,1.0,0.0,3.0,1,0.0,0.0 +1.0,1.61872467529,2.16490039419,8.86819056209,1.0,1.0,1.0,5.0,0,2.16490039419,1.61872467529 +1.0,1.7439425438,2.21996572386,7.11726568572,1.0,0.0,1.0,5.0,1,2.21996572386,1.7439425438 +0.0,8.04845637131,0.392516239172,1.93972367373,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,5.02195404028,6.43018626059,9.59935455341,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,7.75523116993,1.42806342603,6.67114165301,0.0,0.0,0.0,3.0,1,0.0,0.0 +1.0,7.56580631658,5.22219675249,9.95024263392,0.0,1.0,0.0,4.0,1,5.22219675249,7.56580631658 +1.0,7.86032832643,6.99939286732,7.42811013652,1.0,1.0,0.0,4.0,0,6.99939286732,7.86032832643 +0.0,7.50263508584,6.54425304303,4.28507070855,1.0,1.0,1.0,5.0,1,4.28507070855,0.0 +0.0,2.64065889936,7.90666283292,7.02243508702,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,8.65202246668,7.40392885809,0.253916871793,0.0,1.0,1.0,5.0,1,0.0,0.0 +1.0,8.46854810102,2.47185366053,9.97807608917,1.0,1.0,0.0,4.0,1,2.47185366053,8.46854810102 +0.0,5.9101202539,0.674002291792,1.47139536018,1.0,0.0,0.0,4.0,0,1.47139536018,0.0 +0.0,6.44146880285,1.79377027363,6.67827959095,1.0,1.0,0.0,4.0,1,6.67827959095,0.0 +1.0,3.82769570542,6.6859532396,4.4091432319,1.0,0.0,0.0,4.0,1,4.4091432319,3.82769570542 +0.0,0.961699229889,8.31271007851,0.339264341243,1.0,1.0,0.0,4.0,1,0.339264341243,0.0 +1.0,5.61299407054,5.59939913582,0.160103668496,1.0,0.0,0.0,4.0,1,0.160103668496,5.61299407054 +1.0,6.65017903704,6.59522676295,6.34793692882,1.0,0.0,0.0,4.0,1,6.34793692882,6.65017903704 +0.0,3.3500690766,6.8652275663,0.611865560201,0.0,0.0,0.0,3.0,1,0.0,0.0 +0.0,5.38139899852,8.71816405298,3.71399242052,0.0,0.0,0.0,3.0,1,0.0,0.0 +1.0,0.970028306118,1.6438010106,0.742181390697,0.0,1.0,0.0,4.0,0,1.6438010106,0.970028306118 +0.0,0.0985520659244,1.73675925511,5.30344101957,1.0,1.0,0.0,4.0,1,5.30344101957,0.0 +0.0,8.80758202374,9.69511730589,0.69979190377,0.0,0.0,0.0,3.0,0,0.0,0.0 +1.0,0.20615911116,8.30789464952,1.27399465099,0.0,1.0,0.0,4.0,1,8.30789464952,0.20615911116 +0.0,3.52886008414,9.46424218581,4.5290026801,1.0,0.0,0.0,4.0,1,4.5290026801,0.0 +1.0,2.56807138272,2.75147307728,4.52332474862,0.0,1.0,0.0,4.0,1,2.75147307728,2.56807138272 +1.0,1.72498936339,0.183340190021,2.85774046389,1.0,0.0,0.0,4.0,1,0.183340190021,1.72498936339 diff --git a/plugins/PRAplugin/tests/gold/graphModel/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/graphModel/Print_sim_PS.csv new file mode 100644 index 0000000000..ec90ef7cca --- /dev/null +++ b/plugins/PRAplugin/tests/gold/graphModel/Print_sim_PS.csv @@ -0,0 +1,21 @@ +status2,status3,status4,status5,statusSG1,statusSG2,statusSG3 +1.0,1.0,0.0,1.0,1.0,1.0,1.0 +1.0,1.0,1.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0,0.0,0.0 +1.0,1.0,1.0,1.0,1.0,1.0,1.0 +1.0,1.0,1.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0,0.0,0.0 +0.0,1.0,1.0,1.0,1.0,1.0,1.0 +0.0,1.0,1.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0,0.0,0.0 
+1.0,1.0,1.0,1.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0,0.0,0.0 +0.0,1.0,1.0,0.0,0.0,0.0,0.0 +1.0,0.0,1.0,0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,0.0,0.0,0.0,0.0 +0.0,1.0,1.0,1.0,1.0,1.0,1.0 +0.0,1.0,1.0,1.0,1.0,1.0,1.0 +1.0,1.0,1.0,0.0,1.0,1.0,1.0 +0.0,1.0,1.0,0.0,0.0,0.0,0.0 diff --git a/plugins/PRAplugin/tests/gold/graphModelTD/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/graphModelTD/Print_sim_PS.csv new file mode 100644 index 0000000000..9fb19bed02 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/graphModelTD/Print_sim_PS.csv @@ -0,0 +1,101 @@ +statusA,statusB,statusC,statusD,statusOUT +0.0,8.54367820978,1.0,6.34425606028,6.34425606028 +1.0,5.21344427374,1.0,8.92634152643,1.0 +0.0,5.66469494152,0.0,4.14429881241,4.14429881241 +1.0,7.70896024483,1.0,7.46493146696,1.0 +1.0,7.06316464512,1.0,4.25363342144,1.0 +0.0,9.55172521052,1.0,4.61718905126,4.61718905126 +1.0,2.36282740309,1.0,9.73232913756,1.0 +1.0,4.33213918804,1.0,8.40653911196,1.0 +0.0,7.88976466467,0.0,4.18491912637,4.18491912637 +1.0,7.22607659344,1.0,6.7550185548,1.0 +0.0,9.43836184671,1.0,0.846319017198,0.846319017198 +1.0,1.36711772796,1.0,3.01352649532,1.0 +1.0,7.18080360377,0.0,3.40591863343,3.40591863343 +0.0,0.556951952297,1.0,8.49178585887,8.49178585887 +0.0,7.20294083403,0.0,4.22058026405,4.22058026405 +0.0,9.77032388555,0.0,3.49614854983,3.49614854983 +1.0,3.11235579967,1.0,7.62914438677,1.0 +1.0,0.169086421414,1.0,9.93805130477,1.0 +1.0,6.32030571958,1.0,2.21913621114,1.0 +1.0,4.07595465986,1.0,4.52421936312,1.0 +0.0,5.51202463347,1.0,9.23295204743,9.23295204743 +1.0,2.43672777955,0.0,6.76328596816,2.43672777955 +0.0,2.46745002746,1.0,5.58289473774,5.58289473774 +0.0,1.46987809135,0.0,1.81724623819,1.81724623819 +1.0,5.54073200923,1.0,6.76278378739,1.0 +0.0,3.14721435615,0.0,1.43153963411,1.43153963411 +1.0,5.45302780472,1.0,5.60297877891,1.0 +0.0,1.88790989851,0.0,7.05928655273,7.05928655273 +0.0,3.18118672659,0.0,1.7570706135,1.7570706135 +1.0,0.751518469944,0.0,5.87710535291,0.751518469944 +0.0,9.04965425819,1.0,9.9808973493,9.9808973493 +1.0,6.80387449144,1.0,2.80808676565,1.0 +0.0,1.37576379147,0.0,9.6253840671,9.6253840671 +1.0,0.354378188577,0.0,4.99468520866,0.354378188577 +1.0,2.09142977653,1.0,4.96545464615,1.0 +1.0,1.05172955223,1.0,3.44798815517,1.0 +0.0,0.839016882432,0.0,1.44935434485,1.44935434485 +0.0,2.20089034927,0.0,3.6894097863,3.6894097863 +1.0,2.21242859313,0.0,8.86673062082,2.21242859313 +0.0,6.70997979741,0.0,8.49307628313,8.49307628313 +0.0,8.43483956029,1.0,9.79838806433,9.79838806433 +1.0,7.23542866,1.0,1.43871539772,1.0 +0.0,0.00220919027976,0.0,2.10153283135,2.10153283135 +1.0,0.426537555276,1.0,7.70198905089,1.0 +0.0,3.49934295134,0.0,0.455278316619,0.455278316619 +1.0,6.3636691464,0.0,9.70469919958,6.3636691464 +1.0,2.1524073142,0.0,4.18249898455,2.1524073142 +1.0,4.87968352504,1.0,9.03511989606,1.0 +0.0,5.83087804165,1.0,6.18160137818,6.18160137818 +0.0,4.03369626124,1.0,4.94684698641,4.94684698641 +0.0,4.32740508447,1.0,0.498994786874,0.498994786874 +1.0,6.73068176413,1.0,8.33101266956,1.0 +0.0,7.61717799064,1.0,0.0695761828845,0.0695761828845 +0.0,4.66651448856,0.0,6.55315397693,6.55315397693 +1.0,6.89799122673,1.0,6.01954591834,1.0 +1.0,5.90383170077,0.0,8.94641244294,5.90383170077 +1.0,4.51820713573,1.0,2.2190989303,1.0 +0.0,5.95145043124,0.0,6.01018668292,6.01018668292 +0.0,3.37643175697,0.0,1.17933260537,1.17933260537 +1.0,3.95165465166,1.0,2.36254606451,1.0 +0.0,1.71627178828,0.0,2.85006864528,2.85006864528 
+0.0,7.42456809558,0.0,9.08948720179,9.08948720179 +0.0,5.87469437064,1.0,5.12172996419,5.12172996419 +1.0,3.69180834472,1.0,4.07222080838,1.0 +0.0,9.05994809444,1.0,5.01132703969,5.01132703969 +0.0,3.26025241829,1.0,0.133609971994,0.133609971994 +1.0,3.71787967247,1.0,9.80906389882,1.0 +1.0,4.12321291494,1.0,0.677404147731,1.0 +0.0,1.86849705918,0.0,1.20237762369,1.20237762369 +0.0,1.25651137933,0.0,1.46793940139,1.46793940139 +0.0,6.90932710117,1.0,5.73289396375,5.73289396375 +0.0,9.13233185167,0.0,2.81922152564,2.81922152564 +1.0,5.88011842358,0.0,4.70368516042,4.70368516042 +1.0,6.941086272,0.0,9.32184133896,6.941086272 +0.0,9.11044156158,0.0,9.08314684385,9.08314684385 +1.0,7.99031138373,0.0,0.138341446905,0.138341446905 +1.0,1.80975065842,0.0,1.91032195741,1.80975065842 +1.0,2.75970318885,1.0,4.03804516747,1.0 +1.0,2.49442232598,0.0,1.10833024632,1.10833024632 +0.0,0.635410915277,0.0,1.97245596488,1.97245596488 +0.0,8.63804379214,0.0,5.96010132366,5.96010132366 +1.0,4.0640885113,1.0,4.50673643372,1.0 +1.0,4.99694360304,0.0,5.39283489468,4.99694360304 +0.0,9.24575758615,1.0,5.43842963535,5.43842963535 +1.0,9.45598376204,0.0,6.15800469791,6.15800469791 +0.0,8.46160003414,1.0,6.35346718979,6.35346718979 +1.0,1.4423927086,1.0,8.27573800419,1.0 +0.0,6.99028219492,1.0,1.05700696377,1.05700696377 +0.0,5.92126208263,1.0,6.77450179979,6.77450179979 +0.0,4.4342601752,0.0,6.40692928257,6.40692928257 +1.0,0.915649514859,1.0,4.44277480581,1.0 +0.0,5.98522425769,0.0,3.56713719516,3.56713719516 +0.0,0.413341699264,1.0,1.9146918766,1.9146918766 +0.0,0.0280287349662,0.0,4.76743555506,4.76743555506 +0.0,6.83045362747,1.0,5.47946552874,5.47946552874 +1.0,8.11790880238,1.0,5.93001340421,1.0 +1.0,5.46360976656,1.0,2.50187717204,1.0 +0.0,7.92058132773,0.0,8.86456505835,8.86456505835 +1.0,8.34977439566,0.0,0.713679057246,0.713679057246 +0.0,4.39800174311,1.0,4.47451354109,4.47451354109 diff --git a/plugins/PRAplugin/tests/gold/markovModel_2states/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/markovModel_2states/Print_sim_PS.csv new file mode 100644 index 0000000000..f8d88cf299 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/markovModel_2states/Print_sim_PS.csv @@ -0,0 +1,101 @@ +initialState,finalState +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 diff --git a/plugins/PRAplugin/tests/gold/markovModel_2states_tau/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/markovModel_2states_tau/Print_sim_PS.csv new file mode 100644 index 0000000000..163cad9c51 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/markovModel_2states_tau/Print_sim_PS.csv @@ -0,0 +1,101 @@ +initialState,finalState +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 
+1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 diff --git a/plugins/PRAplugin/tests/gold/markovModel_3states/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/markovModel_3states/Print_sim_PS.csv new file mode 100644 index 0000000000..b6b11797a6 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/markovModel_3states/Print_sim_PS.csv @@ -0,0 +1,101 @@ +initialState,finalState +1.0,3 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,3 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,3 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,3 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,3 +1.0,3 +1.0,1 diff --git a/plugins/PRAplugin/tests/gold/markovModel_3states_complexTrans/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/markovModel_3states_complexTrans/Print_sim_PS.csv new file mode 100644 index 0000000000..5d8ff16027 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/markovModel_3states_complexTrans/Print_sim_PS.csv @@ -0,0 +1,101 @@ +initialState,finalState +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 diff --git a/plugins/PRAplugin/tests/gold/markovModel_3states_instantTrans/Print_sim_PS.csv b/plugins/PRAplugin/tests/gold/markovModel_3states_instantTrans/Print_sim_PS.csv new file mode 100644 index 0000000000..05a55ec613 --- /dev/null +++ b/plugins/PRAplugin/tests/gold/markovModel_3states_instantTrans/Print_sim_PS.csv @@ -0,0 +1,101 @@ +initialState,finalState +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,1 +1.0,1 +1.0,2 +1.0,2 +1.0,1 +1.0,1 +1.0,1 
+1.0,2 +1.0,1 diff --git a/plugins/PRAplugin/tests/graphModel/graphTest.xml b/plugins/PRAplugin/tests/graphModel/graphTest.xml new file mode 100644 index 0000000000..bdb20b1109 --- /dev/null +++ b/plugins/PRAplugin/tests/graphModel/graphTest.xml @@ -0,0 +1,35 @@ + + + 1 + + + 2,3,4 + + + 5 + + + 5 + + + 5 + + + 6,7,8 + + + SG1,SG2,SG3 + + + SG1,SG2,SG3 + + + SG1,SG2,SG3 + + + + + + + + diff --git a/plugins/PRAplugin/tests/graphModelTD/graphTestTD.xml b/plugins/PRAplugin/tests/graphModelTD/graphTestTD.xml new file mode 100644 index 0000000000..51ef1d55be --- /dev/null +++ b/plugins/PRAplugin/tests/graphModelTD/graphTestTD.xml @@ -0,0 +1,19 @@ + + + A,B + + + D + + + C + + + D + + + out + + + + diff --git a/plugins/PRAplugin/tests/test_ETmodel.xml b/plugins/PRAplugin/tests/test_ETmodel.xml new file mode 100644 index 0000000000..2e936f1ab6 --- /dev/null +++ b/plugins/PRAplugin/tests/test_ETmodel.xml @@ -0,0 +1,76 @@ + + + + ETmodel + simRun + 1 + + + + eventTree.xml + + + + + statusACC,statusLPI,statusLPR,sequence + + ACC + LPI + LPR + sequence + + + + + +

+ 0.5
+
+
+ + + + + 10 + + + distrib + + + distrib + + + distrib + + + + + + + eventTreeTest + ET + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + statusACC,statusLPI,statusLPR + OutputPlaceHolder + + + statusACC,statusLPI,statusLPR + sequence + + + +
diff --git a/plugins/PRAplugin/tests/test_ETmodel_TD.xml b/plugins/PRAplugin/tests/test_ETmodel_TD.xml new file mode 100644 index 0000000000..8ca29a9095 --- /dev/null +++ b/plugins/PRAplugin/tests/test_ETmodel_TD.xml @@ -0,0 +1,79 @@ + + + + ETmodelTD + simRun + 1 + + + + eventTree.xml + + + + + statusACC,statusLPI,statusLPR,sequence + ACC + LPI + LPR + sequence + + + + + +

+ 0.5
+
+ + 0. + 10. + +
+ + + + + 10 + + + distrib + + + distrib + + + failTime + + + + + + + eventTreeTest + ET + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + statusACC,statusLPI,statusLPR + OutputPlaceHolder + + + statusACC,statusLPI,statusLPR + sequence + + + +
diff --git a/plugins/PRAplugin/tests/test_FTmodel.xml b/plugins/PRAplugin/tests/test_FTmodel.xml new file mode 100644 index 0000000000..0a2d854bf7 --- /dev/null +++ b/plugins/PRAplugin/tests/test_FTmodel.xml @@ -0,0 +1,80 @@ + + + + FTmodel + simRun + 1 + + + + FT1.xml + + + + + statusBE1,statusBE2,statusBE3,statusBE4,TOP + + TOP + BE1 + BE2 + BE3 + BE4 + + + + + +

+ 0.5
+
+
+ + + + + 10 + + + distrib + + + distrib + + + distrib + + + distrib + + + + + + + faultTreeTest + FT + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + statusBE1,statusBE2,statusBE3,statusBE4 + OutputPlaceHolder + + + statusBE1,statusBE2,statusBE3,statusBE4 + TOP + + + +
diff --git a/plugins/PRAplugin/tests/test_FTmodel_TD.xml b/plugins/PRAplugin/tests/test_FTmodel_TD.xml new file mode 100644 index 0000000000..5bbb2097b5 --- /dev/null +++ b/plugins/PRAplugin/tests/test_FTmodel_TD.xml @@ -0,0 +1,83 @@ + + + + FTmodelTD + simRun + 1 + + + + FT1.xml + + + + + statusA,statusB,statusC,statusD,TOP + TOP + A + B + C + D + + + + + +

+ 0.5
+
+ + 0. + 10. + +
+ + + + + 50 + + + failTime + + + demand + + + failTime + + + demand + + + + + + + faultTreeTest + FT + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + statusA,statusB,statusC,statusD + OutputPlaceHolder + + + statusA,statusB,statusC,statusD + TOP + + + +
diff --git a/plugins/PRAplugin/tests/test_dataClassifier_postprocessor.xml b/plugins/PRAplugin/tests/test_dataClassifier_postprocessor.xml new file mode 100644 index 0000000000..f4e2bef5f2 --- /dev/null +++ b/plugins/PRAplugin/tests/test_dataClassifier_postprocessor.xml @@ -0,0 +1,140 @@ + + + + dataClassifier + simRun,import_ET_PS,classify,printOnFile_ET_PS,printOnFile_sim_PS + 1 + + + + +

+ 0.4
+
+ + 4 + 1 + + + 5 + 1 + +
+ + + + + 10 + + + ACC_distrib + + + LPI_distrib + + + LPR_distrib + + + + + + eventTree.xml + + + + + ACC_status + time_LPI + time_LPR + + + LPI_status + time_LPI + time_LPR + + + LPR_status + time_LPI + time_LPR + + + + + + ACC_status,time_LPI,time_LPR,out,LPI_status,LPR_status + + + OpenPSA + True + + + + + func_ACC + + + func_LPI + + + func_LPR + + + + + + + inputPlaceHolder + PythonModule + MC_external + sim_PS + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + sim_PS + ET_Classifier + sim_PS + + + ET_PS + Print_ET_PS + + + sim_PS + Print_sim_PS + + + + + + csv + ET_PS + input,output + + + csv + sim_PS + input,output + + + + + + ACC,LPI,LPR + sequence + + + ACC_status,time_LPI,time_LPR + OutputPlaceHolder + + + ACC_status,time_LPI,time_LPR + out, LPI_status, LPR_status + + + +
diff --git a/plugins/PRAplugin/tests/test_dataClassifier_postprocessor_HS.xml b/plugins/PRAplugin/tests/test_dataClassifier_postprocessor_HS.xml new file mode 100644 index 0000000000..2a92a71d51 --- /dev/null +++ b/plugins/PRAplugin/tests/test_dataClassifier_postprocessor_HS.xml @@ -0,0 +1,140 @@ + + + + dataClassifierHS + simRun,import_ET_PS,classify,printOnFile_ET_PS,printOnFile_sim_PS + 1 + + + + +

+ 0.4
+
+ + 4 + 1 + + + 5 + 1 + +
+ + + + + 10 + + + ACC_distrib + + + LPI_distrib + + + LPR_distrib + + + + + + eventTree.xml + + + + + ACC_status + LPI_status + LPR_status + + + ACC_status + LPI_status + LPR_status + + + ACC_status + LPI_status + LPR_status + + + + + + ACC_sim,time_LPI,time_LPR,out,ACC_status,LPI_status,LPR_status,time,temp + + + OpenPSA + True + + + + + func_ACC + + + func_LPI + + + func_LPR + + + + + + + inputPlaceHolder + PythonModule + MC_external + sim_PS + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + sim_PS + ET_Classifier + sim_PS + + + ET_PS + Print_ET_PS + + + sim_PS + Print_sim_PS + + + + + + csv + ET_PS + input,output + + + csv + sim_PS + input,output + + + + + + ACC,LPI,LPR + sequence + + + ACC_sim,time_LPI,time_LPR + OutputPlaceHolder + + + ACC_sim,time_LPI,time_LPR + time, temp, out, ACC_status, LPI_status, LPR_status + + + +
diff --git a/plugins/PRAplugin/tests/test_ensemblePRAModel_discrete.xml b/plugins/PRAplugin/tests/test_ensemblePRAModel_discrete.xml new file mode 100644 index 0000000000..8e7a645daa --- /dev/null +++ b/plugins/PRAplugin/tests/test_ensemblePRAModel_discrete.xml @@ -0,0 +1,189 @@ + + + + ensembleDiscrete + simRun + 1 + + + + FT1.xml + FT2.xml + eventTree.xml + + + + + eventTree + faultTree1 + faultTree2 + PRA_Model + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + +

0.5

+
+ +

0.1

+
+
+ + + + + 100 + + + distrib + + + distrib + + + distribLow + + + distrib + + + distrib + + + distrib + + + distribLow + + + + + + + initEvent,statusACC,statusLPI,LPR,sequence + IE + ACC + LPI + LPR + sequence + + + + A,B,LPR + LPR + A + B + + + + A,C,alpha + alpha + A + C + + + + alpha,beta,gamma,statusACC + RBD + signal + ACC + alpha + beta + gamma + ACC + + + + initialState,statusLPI + initialState + statusLPI + 10 + + 1 + + + 0 + + + + + + ET + eventTree + ET_PS + + + markov + markov_input + markov_PS + + + graph + graph_input + graph_PS + + + FT1 + faultTree1 + FT1_PS + + + FT2 + faultTree2 + FT2_PS + + + + + + + A,B,C,beta,gamma,initialState,initEvent + OutputPlaceHolder + + + A,B,C,beta,gamma,initialState,initEvent + sequence + + + initEvent,statusACC,statusLPI,LPR + sequence + + + A,B + LPR + + + A,C + alpha + + + initialState + OutputPlaceHolder + + + initialState + statusLPI + + + alpha,beta,gamma + OutputPlaceHolder + + + alpha,beta,gamma + statusACC + + + +
diff --git a/plugins/PRAplugin/tests/test_ensemblePRAModel_mixed.xml b/plugins/PRAplugin/tests/test_ensemblePRAModel_mixed.xml new file mode 100644 index 0000000000..1ae9f42190 --- /dev/null +++ b/plugins/PRAplugin/tests/test_ensemblePRAModel_mixed.xml @@ -0,0 +1,193 @@ + + + + ensembleMixed + simRun + 1 + + + + FT1.xml + FT2.xml + eventTree.xml + + + + + eventTree + faultTree1 + faultTree2 + PRA_Model + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + +

0.5

+
+ +

0.1

+
+ + 0. + 10. + +
+ + + + + 100 + + + distrib + + + failTime + + + failTime + + + failTime + + + distrib + + + distrib + + + distribLow + + + + + + + initEvent,statusACC,statusLPI,LPR,sequence + IE + ACC + LPI + LPR + sequence + + + + A,B,LPR + LPR + A + B + + + + A,C,alpha + alpha + A + C + + + + alpha,beta,gamma,statusACC + RBD + signal + ACC + alpha + beta + gamma + ACC + + + + initialState,statusLPI + initialState + statusLPI + 10 + + 1 + + + 0 + + + + + + ET + eventTree + ET_PS + + + markov + markov_input + markov_PS + + + graph + graph_input + graph_PS + + + FT1 + faultTree1 + FT1_PS + + + FT2 + faultTree2 + FT2_PS + + + + + + + A,B,C,beta,gamma,initialState,initEvent + OutputPlaceHolder + + + A,B,C,beta,gamma,initialState,initEvent + sequence,statusLPI,statusACC,LPR + + + initEvent,statusACC,statusLPI,LPR + sequence + + + A,B + LPR + + + A,C + alpha + + + initialState + OutputPlaceHolder + + + initialState + statusLPI + + + alpha,beta,gamma + OutputPlaceHolder + + + alpha,beta,gamma + statusACC + + + +
diff --git a/plugins/PRAplugin/tests/test_graphModel.xml b/plugins/PRAplugin/tests/test_graphModel.xml new file mode 100644 index 0000000000..149bb387c9 --- /dev/null +++ b/plugins/PRAplugin/tests/test_graphModel.xml @@ -0,0 +1,81 @@ + + + + graphModel + simRun + 1 + + + + +

+ 0.5
+
+
+ + + + + 20 + + + distrib + + + distrib + + + distrib + + + distrib + + + + + + + status2,status3,status4,status5,statusSG1,statusSG2,statusSG3 + + graphTest + CST + SG1,SG2,SG3 + 2 + 3 + 4 + 5 + SG1 + SG2 + SG3 + + + + + + inputPlaceHolder + graph + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + status2,status3,status4,status5 + OutputPlaceHolder + + + status2,status3,status4,status5 + statusSG1,statusSG2,statusSG3 + + + +
diff --git a/plugins/PRAplugin/tests/test_graphModel_TD.xml b/plugins/PRAplugin/tests/test_graphModel_TD.xml new file mode 100644 index 0000000000..39967326e4 --- /dev/null +++ b/plugins/PRAplugin/tests/test_graphModel_TD.xml @@ -0,0 +1,82 @@ + + + + graphModelTD + simRun + 1 + + + + +

+ 0.5
+
+ + 0. + 10. + +
+ + + + + 100 + + + demand + + + failTime + + + demand + + + failTime + + + + + + + statusA,statusB,statusC,statusD,statusOUT + graphTestTD + in + out + A + B + C + D + out + + + + + + inputPlaceHolder + graph + MC_external + sim_PS + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + + statusA,statusB,statusC,statusD + OutputPlaceHolder + + + statusA,statusB,statusC,statusD + statusOUT + + + +
diff --git a/plugins/PRAplugin/tests/test_markovModel_2states.xml b/plugins/PRAplugin/tests/test_markovModel_2states.xml new file mode 100644 index 0000000000..6c097764a5 --- /dev/null +++ b/plugins/PRAplugin/tests/test_markovModel_2states.xml @@ -0,0 +1,93 @@ + + + + markovModel_2states + simRun,plot + 1 + + + + + 1.0 + 0.0 + 0.0 + + + + + + + 100 + + + InitialStateDist + + + + + + + initialState,finalState + initialState + finalState + 500 + + 2 + + + 1 + + + + + + + inputPlaceHolder + markov2 + MC_external + sim_PS + + + sim_PS + hist + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + histogram + sim_PS|Output|finalState + True + + finalState + pdf + + + png + + <text>Test MarkovModel</text> + + + + + + + + initialState + OutputPlaceHolder + + + initialState + finalState + + + + diff --git a/plugins/PRAplugin/tests/test_markovModel_2states_tau.xml b/plugins/PRAplugin/tests/test_markovModel_2states_tau.xml new file mode 100644 index 0000000000..e24af0a05f --- /dev/null +++ b/plugins/PRAplugin/tests/test_markovModel_2states_tau.xml @@ -0,0 +1,93 @@ + + + + markovModel_2states_tau + simRun,plot + 1 + + + + + 1.0 + 0.0 + 0.0 + + + + + + + 100 + + + InitialStateDist + + + + + + + initialState,finalState + initialState + finalState + 10 + + 2 + + + 1 + + + + + + + inputPlaceHolder + markov2 + MC_external + sim_PS + + + sim_PS + hist + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + histogram + sim_PS|Output|finalState + True + + finalState + pdf + + + png + + <text>Test MarkovModel</text> + + + + + + + + initialState + OutputPlaceHolder + + + initialState + finalState + + + + diff --git a/plugins/PRAplugin/tests/test_markovModel_3states.xml b/plugins/PRAplugin/tests/test_markovModel_3states.xml new file mode 100644 index 0000000000..7a94f256fb --- /dev/null +++ b/plugins/PRAplugin/tests/test_markovModel_3states.xml @@ -0,0 +1,99 @@ + + + + markovModel_3states + simRun,plot + 1 + + + + + 1.0 + 0.0 + 0.0 + + + + + + + 100 + + + InitialStateDist + + + + + + + initialState,finalState + initialState + finalState + 1000 + + 2 + 3 + + + 1 + 3 + + + 1 + 2 + + + + + + + inputPlaceHolder + markov + MC_external + sim_PS + + + sim_PS + hist + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + histogram + sim_PS|Output|finalState + True + + finalState + pdf + + + png + + <text>Test MarkovModel</text> + + + + + + + + initialState + OutputPlaceHolder + + + initialState + finalState + + + + diff --git a/plugins/PRAplugin/tests/test_markovModel_3states_complexTrans.xml b/plugins/PRAplugin/tests/test_markovModel_3states_complexTrans.xml new file mode 100644 index 0000000000..3a90237a09 --- /dev/null +++ b/plugins/PRAplugin/tests/test_markovModel_3states_complexTrans.xml @@ -0,0 +1,93 @@ + + + + markovModel_3states_complexTrans + simRun,plot + 1 + + + + + 1.0 + 0.0 + 0.0 + + + + + + + 100 + + + InitialStateDist + + + + + + + initialState,finalState + initialState + finalState + 1000 + + 2 + + + 1 + + + + + + + inputPlaceHolder + markov + MC_external + sim_PS + + + sim_PS + hist + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + histogram + sim_PS|Output|finalState + True + + finalState + pdf + + + png + + <text>Test MarkovModel</text> + + + + + + + + initialState + OutputPlaceHolder + + + initialState + finalState + + + + diff --git a/plugins/PRAplugin/tests/test_markovModel_3states_instantTrans.xml b/plugins/PRAplugin/tests/test_markovModel_3states_instantTrans.xml new file mode 100644 index 0000000000..0f5aab3258 --- /dev/null +++ 
b/plugins/PRAplugin/tests/test_markovModel_3states_instantTrans.xml @@ -0,0 +1,93 @@ + + + + markovModel_3states_instantTrans + simRun,plot + 1 + + + + + 1.0 + 0.0 + 0.0 + + + + + + + 100 + + + InitialStateDist + + + + + + + initialState,finalState + initialState + finalState + 200 + + 2 + + + 1 + + + + + + + inputPlaceHolder + markov + MC_external + sim_PS + + + sim_PS + hist + Print_sim_PS + + + + + + csv + sim_PS + input,output + + + + + histogram + sim_PS|Output|finalState + True + + finalState + pdf + + + png + + <text>Test MarkovModel</text> + + + + + + + + initialState + OutputPlaceHolder + + + initialState + finalState + + + + diff --git a/plugins/PRAplugin/tests/tests b/plugins/PRAplugin/tests/tests new file mode 100644 index 0000000000..7d98aa94d1 --- /dev/null +++ b/plugins/PRAplugin/tests/tests @@ -0,0 +1,93 @@ +[Tests] + + [./TestGraphModel] + type = 'RavenFramework' + input = 'test_graphModel.xml' + UnorderedCsv = 'graphModel/Print_sim_PS.csv' + [../] + + [./TestMarkovModel_2states] + type = 'RavenFramework' + input = 'test_markovModel_2states.xml' + UnorderedCsv = 'markovModel_2states/Print_sim_PS.csv' + [../] + + [./TestMarkovModel_3states] + type = 'RavenFramework' + input = 'test_markovModel_3states.xml' + UnorderedCsv = 'markovModel_3states/Print_sim_PS.csv' + [../] + + [./TestMarkovModel_3states_complexTrans] + type = 'RavenFramework' + input = 'test_markovModel_3states_complexTrans.xml' + UnorderedCsv = 'markovModel_3states_complexTrans/Print_sim_PS.csv' + [../] + + [./TestMarkovModel_3states_instantTrans] + type = 'RavenFramework' + input = 'test_markovModel_3states_instantTrans.xml' + UnorderedCsv = 'markovModel_3states_instantTrans/Print_sim_PS.csv' + [../] + + [./TestMarkovModel_2states_tau] + type = 'RavenFramework' + input = 'test_markovModel_2states_tau.xml' + UnorderedCsv = 'markovModel_2states_tau/Print_sim_PS.csv' + [../] + + [./TestETModel] + type = 'RavenFramework' + input = 'test_ETmodel.xml' + UnorderedCsv = 'ETmodel/Print_sim_PS.csv' + [../] + + [./TestFTModel] + type = 'RavenFramework' + input = 'test_FTmodel.xml' + UnorderedCsv = 'FTmodel/Print_sim_PS.csv' + [../] + + [./TestGraphModelTD] + type = 'RavenFramework' + input = 'test_graphModel_TD.xml' + UnorderedCsv = 'graphModelTD/Print_sim_PS.csv' + [../] + + [./TestFTModelTD] + type = 'RavenFramework' + input = 'test_FTmodel_TD.xml' + UnorderedCsv = 'FTmodelTD/Print_sim_PS.csv' + [../] + + [./TestETModelTD] + type = 'RavenFramework' + input = 'test_ETmodel_TD.xml' + UnorderedCsv = 'ETmodelTD/Print_sim_PS.csv' + [../] + + [./TestEnesembleDiscrete] + type = 'RavenFramework' + input = 'test_ensemblePRAModel_discrete.xml' + UnorderedCsv = 'ensembleDiscrete/Print_sim_PS.csv' + [../] + + [./TestEnesembleMixed] + type = 'RavenFramework' + input = 'test_ensemblePRAModel_mixed.xml' + UnorderedCsv = 'ensembleMixed/Print_sim_PS.csv' + [../] + + [./TestDataClassifierPS] + type = 'RavenFramework' + input = 'test_dataClassifier_postprocessor.xml' + UnorderedCsv = 'dataClassifier/Print_sim_PS.csv dataClassifier/Print_ET_PS.csv' + [../] + + [./TestDataClassifierHS] + type = 'RavenFramework' + input = 'test_dataClassifier_postprocessor_HS.xml' + UnorderedCsv = 'dataClassifierHS/Print_sim_PS_0.csv dataClassifierHS/Print_sim_PS_3.csv dataClassifierHS/Print_sim_PS_7.csv dataClassifierHS/Print_sim_PS_9.csv' + [../] +[] + diff --git a/tests/crow/test_utils.py b/tests/crow/test_utils.py index 4e6dc40f35..b298675777 100644 --- a/tests/crow/test_utils.py +++ b/tests/crow/test_utils.py @@ -32,3 +32,4 @@ """ + diff --git 
a/tests/framework/Distributions/gold/test_markov/Grid_dump.csv b/tests/framework/Distributions/gold/test_markov/Grid_dump.csv new file mode 100644 index 0000000000..2229da565c --- /dev/null +++ b/tests/framework/Distributions/gold/test_markov/Grid_dump.csv @@ -0,0 +1,4 @@ +x,y +1.0,2.0 +2.0,4.0 +4.0,8.0 diff --git a/tests/framework/Distributions/gold/test_markov/MC_dump.csv b/tests/framework/Distributions/gold/test_markov/MC_dump.csv new file mode 100644 index 0000000000..7d91aa9585 --- /dev/null +++ b/tests/framework/Distributions/gold/test_markov/MC_dump.csv @@ -0,0 +1,101 @@ +x,y +1.0,2.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +4.0,8.0 +4.0,8.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +4.0,8.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +1.0,2.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +2.0,4.0 +1.0,2.0 +2.0,4.0 +2.0,4.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +2.0,4.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +2.0,4.0 +2.0,4.0 +1.0,2.0 +2.0,4.0 +4.0,8.0 +4.0,8.0 +1.0,2.0 +2.0,4.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +2.0,4.0 +4.0,8.0 +2.0,4.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +4.0,8.0 +2.0,4.0 +2.0,4.0 +4.0,8.0 +1.0,2.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +1.0,2.0 +4.0,8.0 +2.0,4.0 +4.0,8.0 +2.0,4.0 +4.0,8.0 +2.0,4.0 +1.0,2.0 +1.0,2.0 +1.0,2.0 +2.0,4.0 +2.0,4.0 +1.0,2.0 +4.0,8.0 +2.0,4.0 +2.0,4.0 diff --git a/tests/framework/Distributions/test_distributionsMarkov.xml b/tests/framework/Distributions/test_distributionsMarkov.xml new file mode 100644 index 0000000000..092aef665f --- /dev/null +++ b/tests/framework/Distributions/test_distributionsMarkov.xml @@ -0,0 +1,137 @@ + + + + framework.Distributions.categorical + wangc + 2018-01-24 + Distributions.Markov + + This test is aimed to test the capability of RAVEN to use 1D Markov Categorical distributions. + + + + + test_markov + MCrun,GridRun,OutStreams + 1 + + + + transition.csv + + + + + inputPlaceHolder + PythonModule + MC + PointSet_MC + + + inputPlaceHolder + PythonModule + Grid + PointSet_Grid + + + PointSet_MC + PointSet_Grid + MC_dump + Grid_dump + plotXMC + plotYMC + + + + + + x,y + + + + + + + -1.1 0.8 0.7 + 0.8 -1.4 0.2 + 0.3 0.6 -0.9 + + + + + + + + + + + 100 + 1234 + + + x_dist + + + + + x_dist + 1.0 2.0 4.0 + + + + + + + csv + PointSet_MC + input,output + + + csv + PointSet_Grid + input,output + + + + + histogram + PointSet_MC|Input|x + 30 + + x + + + pdf + + + + + + histogram + PointSet_MC|Output|y + 30 + + x + + + pdf + + + + + + + x + OutputPlaceHolder + + + x + y + + + x + y + + + + diff --git a/tests/framework/Distributions/test_markov/simple.py b/tests/framework/Distributions/test_markov/simple.py new file mode 100644 index 0000000000..e248d19d3f --- /dev/null +++ b/tests/framework/Distributions/test_markov/simple.py @@ -0,0 +1,18 @@ +# Copyright 2017 Battelle Energy Alliance, LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+import math +import numpy + +def run(self, Input): + self.y = 2.0*self.x diff --git a/tests/framework/Distributions/tests b/tests/framework/Distributions/tests index abd197fe09..0685d4f324 100644 --- a/tests/framework/Distributions/tests +++ b/tests/framework/Distributions/tests @@ -17,6 +17,12 @@ csv = 'test_geometric/MC_dump.csv' [../] + [./MarkovCategorical] + type = 'RavenFramework' + input = 'test_distributionsMarkov.xml' + csv = 'test_markov/Grid_dump.csv test_markov/MC_dump.csv' + [../] + [./logUniform] type = 'RavenFramework' input = 'test_distributionsLogUniform.xml' @@ -28,5 +34,5 @@ input = 'test_distributionsLog10Uniform.xml' csv = 'test_log10Uniform/MC_dump.csv' [../] - + [] diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporterExpand/eventTree.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporterExpand/eventTree.xml new file mode 100644 index 0000000000..b9c5c08e63 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporterExpand/eventTree.xml @@ -0,0 +1,33 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches/eventTree.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches/eventTree.xml new file mode 100644 index 0000000000..405c275239 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches/eventTree.xml @@ -0,0 +1,41 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering/eventTree.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering/eventTree.xml new file mode 100644 index 0000000000..5744a8a429 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering/eventTree.xml @@ -0,0 +1,41 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering_expanded/eventTree.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering_expanded/eventTree.xml new file mode 100644 index 0000000000..5744a8a429 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/ETimporter_3branches_NewNumbering_expanded/eventTree.xml @@ -0,0 +1,41 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter/PrintPS.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter/PrintPS.xml deleted file mode 100644 index 0c04cb3073..0000000000 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter/PrintPS.xml +++ /dev/null @@ -1 +0,0 @@ -ACC,LPR,LPIsequencePrintPS.csv diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterCoupledET/PrintPS.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterCoupledET/PrintPS.xml deleted file mode 100644 index baff3a9e63..0000000000 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterCoupledET/PrintPS.xml +++ /dev/null @@ -1 +0,0 @@ -ACC,HPI,LPR,LPIsequencePrintPS.csv diff --git 
a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterDefineBranch/PrintPS.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterDefineBranch/PrintPS.xml deleted file mode 100644 index 7a92115214..0000000000 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterDefineBranch/PrintPS.xml +++ /dev/null @@ -1 +0,0 @@ -ACC,REC2,REC1,LPR,LPIsequencePrintPS.csv diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterExpand/PrintPS.csv b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterExpand/PrintPS.csv new file mode 100644 index 0000000000..df727669f7 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterExpand/PrintPS.csv @@ -0,0 +1,9 @@ +ACC,LPR,LPI,sequence +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0 +0.0,0.0,1.0,2.0 +1.0,0.0,0.0,3.0 +1.0,0.0,1.0,3.0 +0.0,1.0,1.0,2.0 +1.0,1.0,0.0,3.0 +1.0,1.0,1.0,3.0 diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterSymbolicSequence/PrintPS.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterSymbolicSequence/PrintPS.xml deleted file mode 100644 index 0c04cb3073..0000000000 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporterSymbolicSequence/PrintPS.xml +++ /dev/null @@ -1 +0,0 @@ -ACC,LPR,LPIsequencePrintPS.csv diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches/PrintPS.csv b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches/PrintPS.csv new file mode 100644 index 0000000000..6c31e843c9 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches/PrintPS.csv @@ -0,0 +1,7 @@ +ACC,LPR,LPI,sequence +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0 +0.0,2.0,0.0,4.0 +0.0,-1.0,1.0,2.0 +1.0,-1.0,-1.0,3.0 +2.0,-1.0,-1.0,5.0 diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering/PrintPS.csv b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering/PrintPS.csv new file mode 100644 index 0000000000..80183c24d9 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering/PrintPS.csv @@ -0,0 +1,7 @@ +ACC,LPR,LPI,sequence +0.0,2.0,0.0,0.0 +0.0,3.0,0.0,1.0 +0.0,0.0,0.0,4.0 +0.0,-1.0,1.0,2.0 +1.0,-1.0,-1.0,3.0 +2.0,-1.0,-1.0,5.0 diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering_expanded/PrintPS.csv b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering_expanded/PrintPS.csv new file mode 100644 index 0000000000..776bf69309 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/gold/ETimporter_3branches_NewNumbering_expanded/PrintPS.csv @@ -0,0 +1,19 @@ +ACC,LPR,LPI,sequence +0.0,2.0,0.0,0.0 +0.0,3.0,0.0,1.0 +0.0,0.0,0.0,4.0 +0.0,2.0,1.0,2.0 +1.0,2.0,0.0,3.0 +2.0,2.0,0.0,5.0 +1.0,2.0,1.0,3.0 +2.0,2.0,1.0,5.0 +0.0,3.0,1.0,2.0 +0.0,0.0,1.0,2.0 +1.0,3.0,0.0,3.0 +1.0,0.0,0.0,3.0 +2.0,3.0,0.0,5.0 +2.0,0.0,0.0,5.0 +1.0,3.0,1.0,3.0 +1.0,0.0,1.0,3.0 +2.0,3.0,1.0,5.0 +2.0,0.0,1.0,5.0 diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter.xml index adf858297a..6d4847e2e4 100644 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter.xml +++ 
b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter.xml @@ -1,6 +1,6 @@ - framework/PostProcessors/ETimporterPostProcessor + framework/PostProcessors/ETimporterPostProcessor.ET_importer_OpenPSA mandd 2017-11-07 ETimporter @@ -23,6 +23,7 @@ OpenPSA + False diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterMultipleET.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterMultipleET.xml index c9872cbe03..7b331adac5 100644 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterMultipleET.xml +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterMultipleET.xml @@ -1,6 +1,6 @@ - framework/PostProcessors.ETimporterPostProcessorMultipleET + framework/PostProcessors.ETimporterPostProcessorMultipleET.ET_importer_OpenPSA_coupledET mandd 2017-11-07 ETimporter @@ -26,6 +26,7 @@ OpenPSA + False diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterSymbolic.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterSymbolic.xml index a5340453be..043f529475 100644 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterSymbolic.xml +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporterSymbolic.xml @@ -1,6 +1,6 @@ - framework/PostProcessors.ETImporterPostProcessorSymbolic + framework/PostProcessors/ETimporterPostProcessor.ET_importer_OpenPSA_Symbolic mandd 2017-11-07 ETimporter @@ -25,6 +25,7 @@ OpenPSA + False diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches.xml new file mode 100644 index 0000000000..da00fcf810 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/ETimporterPostProcessor.ET_importer_3branches + mandd + 2017-11-07 + ETimporter + + Tests of the ETImporter post-processor: it read an event-tree (ET) from an .xml file (eventTree.xml) and it imports + the ET structure into a PointSet. Note that the ET needs to be in an OpenPSA format. + + + + + ETimporter_3branches + import,printOnFile + 1 + + + + eventTree.xml + + + + + OpenPSA + False + + + + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + PrintPS + + + + + + csv + ET_PS + + + + + + ACC,LPI,LPR + sequence + + + + diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering.xml new file mode 100644 index 0000000000..b1d1689764 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/ETimporterPostProcessor.ETimporter_3branches_NewNumbering + mandd + 2017-11-07 + ETimporter + + Tests of the ETImporter post-processor: it read an event-tree (ET) from an .xml file (eventTree.xml) and it imports + the ET structure into a PointSet. Note that the ET needs to be in an OpenPSA format. 
+ + + + + ETimporter_3branches_NewNumbering + import,printOnFile + 1 + + + + eventTree.xml + + + + + OpenPSA + False + + + + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + PrintPS + + + + + + csv + ET_PS + + + + + + ACC,LPI,LPR + sequence + + + + diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering_expanded.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering_expanded.xml new file mode 100644 index 0000000000..fbbe254b6b --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_3branches_NewNumbering_expanded.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/ETimporterPostProcessor.ETimporter_3branches_NewNumbering_expanded + mandd + 2017-11-07 + ETimporter + + Tests of the ETImporter post-processor: it read an event-tree (ET) from an .xml file (eventTree.xml) and it imports + the ET structure into a PointSet. Note that the ET needs to be in an OpenPSA format. + + + + + ETimporter_3branches_NewNumbering_expanded + import,printOnFile + 1 + + + + eventTree.xml + + + + + OpenPSA + True + + + + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + PrintPS + + + + + + csv + ET_PS + + + + + + ACC,LPI,LPR + sequence + + + + diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_DefineBranch.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_DefineBranch.xml index 6f0dba56d4..e630dc0112 100644 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_DefineBranch.xml +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_DefineBranch.xml @@ -1,6 +1,6 @@ - framework/PostProcessors/ETimporterPostProcessorDefineBranch + framework/PostProcessors/ETimporterPostProcessor.ET_importer_OpenPSA_Define_Branch mandd 2017-11-07 ETimporter @@ -24,6 +24,7 @@ OpenPSA + False diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_expand.xml b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_expand.xml new file mode 100644 index 0000000000..eb790c7248 --- /dev/null +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/test_ETimporter_expand.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/ETimporterPostProcessor.ET_importer_OpenPSA_Expand + mandd + 2017-11-07 + ETimporter + + Tests of the ETImporter post-processor: it read an event-tree (ET) from an .xml file (eventTree.xml) and it imports + the ET structure into a PointSet. Note that the ET needs to be in an OpenPSA format. 
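A side note on the new <expand> flag exercised by these ETImporter tests: it controls whether "don't care" branches, stored as -1.0 in the un-expanded gold files (e.g. gold/ETimporter_3branches/PrintPS.csv), are enumerated into explicit branch combinations, as in gold/ETimporter_3branches_NewNumbering_expanded/PrintPS.csv. The rule implied by the gold data is that each -1.0 entry gets replaced by every value the corresponding branching variable takes elsewhere in the tree. A minimal sketch of that rule, inferred from the golds rather than taken from the ETImporter source:

# Illustrative sketch inferred from the gold CSVs, not the ETImporter code:
# expand rows whose entries are -1.0 ("don't care") into all concrete branch
# value combinations observed for those columns.
import itertools

header = ["ACC", "LPR", "LPI", "sequence"]
rows = [[0.0,  2.0,  0.0, 0.0],
        [0.0,  3.0,  0.0, 1.0],
        [0.0,  0.0,  0.0, 4.0],
        [0.0, -1.0,  1.0, 2.0],
        [1.0, -1.0, -1.0, 3.0],
        [2.0, -1.0, -1.0, 5.0]]

# values each branching variable (every column except 'sequence') can take
values = {j: sorted({r[j] for r in rows if r[j] != -1.0})
          for j in range(len(header) - 1)}

expanded = []
for r in rows:
    wild = [j for j in range(len(header) - 1) if r[j] == -1.0]
    for combo in itertools.product(*(values[j] for j in wild)):
        new = list(r)
        for j, v in zip(wild, combo):
            new[j] = v
        expanded.append(new)

print(len(expanded))   # 18 rows, the same count as the expanded gold file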
+ + + + + ETimporterExpand + import,printOnFile + 1 + + + + eventTree.xml + + + + + OpenPSA + True + + + + + + eventTreeTest + ETimporter + ET_PS + + + ET_PS + PrintPS + + + + + + csv + ET_PS + + + + + + ACC,LPI,LPR + sequence + + + + diff --git a/tests/framework/PostProcessors/ETimporterPostProcessor/tests b/tests/framework/PostProcessors/ETimporterPostProcessor/tests index 3ef50c63d9..0c779a9de4 100644 --- a/tests/framework/PostProcessors/ETimporterPostProcessor/tests +++ b/tests/framework/PostProcessors/ETimporterPostProcessor/tests @@ -3,24 +3,43 @@ type = 'RavenFramework' input = 'test_ETimporter.xml' csv = 'ETimporter/PrintPS.csv' - output = 'ETimporter/PrintPS.xml' [../] [./ET_importer_OpenPSA_Symbolic] type = 'RavenFramework' input = 'test_ETimporterSymbolic.xml' csv = 'ETimporterSymbolicSequence/PrintPS.csv' - output = 'ETimporterSymbolicSequence/PrintPS.xml ETimporterSymbolicSequence/eventTreeMandd_mapping.xml' + output = 'ETimporterSymbolicSequence/eventTreeMandd_mapping.xml' [../] [./ET_importer_OpenPSA_Define_Branch] type = 'RavenFramework' input = 'test_ETimporter_DefineBranch.xml' csv = 'ETimporterDefineBranch/PrintPS.csv' - output = 'ETimporterDefineBranch/PrintPS.xml ETimporterDefineBranch/eventTreeMandd_mapping.xml' + output = 'ETimporterDefineBranch/eventTreeMandd_mapping.xml' [../] [./ET_importer_OpenPSA_coupledET] type = 'RavenFramework' input = 'test_ETimporterMultipleET.xml' csv = 'ETimporterCoupledET/PrintPS.csv' - output = 'ETimporterCoupledET/PrintPS.xml ETimporterCoupledET/eventTreeMain_mapping.xml' + output = 'ETimporterCoupledET/eventTreeMain_mapping.xml' + [../] + [./ET_importer_OpenPSA_expand] + type = 'RavenFramework' + input = 'test_ETimporter_expand.xml' + csv = 'ETimporterExpand/PrintPS.csv' + [../] + [./ET_importer_3branches] + type = 'RavenFramework' + input = 'test_ETimporter_3branches.xml' + csv = 'ETimporter_3branches/PrintPS.csv' + [../] + [./ETimporter_3branches_NewNumbering] + type = 'RavenFramework' + input = 'test_ETimporter_3branches_NewNumbering.xml' + csv = 'ETimporter_3branches_NewNumbering/PrintPS.csv' + [../] + [./ETimporter_3branches_NewNumbering_expanded] + type = 'RavenFramework' + input = 'test_ETimporter_3branches_NewNumbering_expanded.xml' + csv = 'ETimporter_3branches_NewNumbering_expanded/PrintPS.csv' [../] [] diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and/FT_and.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and/FT_and.xml new file mode 100644 index 0000000000..cef336ee12 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and/FT_and.xml @@ -0,0 +1,12 @@ + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT/FT_and_NOT.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT/FT_and_NOT.xml new file mode 100644 index 0000000000..78bfc3416d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT/FT_and_NOT.xml @@ -0,0 +1,16 @@ + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_embedded/FT_and_NOT_embedded.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_embedded/FT_and_NOT_embedded.xml new file mode 100644 index 0000000000..7b831c35a0 --- /dev/null +++ 
b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_embedded/FT_and_NOT_embedded.xml @@ -0,0 +1,13 @@ + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_withNOT_embedded/FT_and_withNOT_withNOT_embedded.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_withNOT_embedded/FT_and_withNOT_withNOT_embedded.xml new file mode 100644 index 0000000000..872b2e99ae --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_and_withNOT_withNOT_embedded/FT_and_withNOT_withNOT_embedded.xml @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_atleast/FT_atleast.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_atleast/FT_atleast.xml new file mode 100644 index 0000000000..a85862fb94 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_atleast/FT_atleast.xml @@ -0,0 +1,12 @@ + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_cardinality/FT_cardinality.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_cardinality/FT_cardinality.xml new file mode 100644 index 0000000000..f7618ca5c1 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_cardinality/FT_cardinality.xml @@ -0,0 +1,13 @@ + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_component/FT1.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_component/FT1.xml new file mode 100644 index 0000000000..61d3140094 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_component/FT1.xml @@ -0,0 +1,47 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_doubleNot/FTimporter_doubleNot.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_doubleNot/FTimporter_doubleNot.xml new file mode 100644 index 0000000000..4eebbe1867 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_doubleNot/FTimporter_doubleNot.xml @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_iff/FT_iff.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_iff/FT_iff.xml new file mode 100644 index 0000000000..e9c8250711 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_iff/FT_iff.xml @@ -0,0 +1,10 @@ + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_imply/FT_imply.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_imply/FT_imply.xml new file mode 100644 index 0000000000..c4c88dc2e7 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_imply/FT_imply.xml @@ -0,0 +1,10 @@ + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_model_data.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_model_data.xml 
new file mode 100644 index 0000000000..ec0bfcb7d9 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_model_data.xml @@ -0,0 +1,21 @@ + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_one.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_one.xml new file mode 100644 index 0000000000..2327ebc60a --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_one.xml @@ -0,0 +1,16 @@ + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_two.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_two.xml new file mode 100644 index 0000000000..b99b1f5288 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_multipleFTs/trans_two.xml @@ -0,0 +1,19 @@ + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nand/FT_nand.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nand/FT_nand.xml new file mode 100644 index 0000000000..aaf5e0f875 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nand/FT_nand.xml @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nor/FT_nor.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nor/FT_nor.xml new file mode 100644 index 0000000000..b31bfd6b33 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_nor/FT_nor.xml @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_not/FTimporter_not.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_not/FTimporter_not.xml new file mode 100644 index 0000000000..d899c9b693 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_not/FTimporter_not.xml @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or/FT_or.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or/FT_or.xml new file mode 100644 index 0000000000..00dbf42843 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or/FT_or.xml @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or_houseEvent/FT_or_houseEvent.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or_houseEvent/FT_or_houseEvent.xml new file mode 100644 index 0000000000..d0b983638c --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_or_houseEvent/FT_or_houseEvent.xml @@ -0,0 +1,14 @@ + + + + + + + + + + + + + + \ No newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_xor/FT_xor.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_xor/FT_xor.xml new file mode 100644 index 0000000000..20b096014b --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/FTimporter_xor/FT_xor.xml @@ -0,0 +1,11 @@ + + + + + + + + + + + \ No 
newline at end of file diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and/PrintPS.csv new file mode 100644 index 0000000000..642917d42f --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,0.0 +1.0,0.0,0.0,0.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,0.0 +1.0,1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT/PrintPS.csv new file mode 100644 index 0000000000..ebbb161fed --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,0.0 +1.0,0.0,0.0,0.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,1.0 +1.0,1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_embedded/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_embedded/PrintPS.csv new file mode 100644 index 0000000000..ebbb161fed --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_embedded/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,0.0 +1.0,0.0,0.0,0.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,1.0 +1.0,1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_withNOT_embedded/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_withNOT_embedded/PrintPS.csv new file mode 100644 index 0000000000..642917d42f --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_and_withNOT_withNOT_embedded/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,0.0 +1.0,0.0,0.0,0.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,0.0 +1.0,1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_atleast/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_atleast/PrintPS.csv new file mode 100644 index 0000000000..0df8bb7962 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_atleast/PrintPS.csv @@ -0,0 +1,17 @@ +BE2,BE3,BE1,BE4,TOP +0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,1.0,0.0 +0.0,1.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,1.0,1.0 +1.0,1.0,0.0,0.0,1.0 +1.0,1.0,0.0,1.0,1.0 +0.0,0.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0,1.0 +0.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0 +1.0,0.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0 +1.0,1.0,1.0,0.0,1.0 +1.0,1.0,1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_cardinality/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_cardinality/PrintPS.csv new file mode 100644 index 0000000000..c173726e74 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_cardinality/PrintPS.csv @@ -0,0 +1,33 @@ +BE2,BE3,BE1,BE4,BE5,TOP +0.0,0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,0.0,1.0,0.0 +0.0,0.0,0.0,1.0,0.0,0.0 +0.0,0.0,0.0,1.0,1.0,1.0 +0.0,1.0,0.0,0.0,0.0,0.0 +0.0,1.0,0.0,0.0,1.0,1.0 +0.0,1.0,0.0,1.0,0.0,1.0 
+0.0,1.0,0.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,0.0,1.0,1.0 +1.0,0.0,0.0,1.0,0.0,1.0 +1.0,0.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0,1.0 +1.0,1.0,0.0,0.0,1.0,1.0 +1.0,1.0,0.0,1.0,0.0,1.0 +1.0,1.0,0.0,1.0,1.0,0.0 +0.0,0.0,1.0,0.0,0.0,0.0 +0.0,0.0,1.0,0.0,1.0,1.0 +0.0,0.0,1.0,1.0,0.0,1.0 +0.0,0.0,1.0,1.0,1.0,1.0 +0.0,1.0,1.0,0.0,0.0,1.0 +0.0,1.0,1.0,0.0,1.0,1.0 +0.0,1.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0,0.0 +1.0,0.0,1.0,0.0,0.0,1.0 +1.0,0.0,1.0,0.0,1.0,1.0 +1.0,0.0,1.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0,0.0 +1.0,1.0,1.0,0.0,0.0,1.0 +1.0,1.0,1.0,0.0,1.0,0.0 +1.0,1.0,1.0,1.0,0.0,0.0 +1.0,1.0,1.0,1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_component/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_component/PrintPS.csv new file mode 100644 index 0000000000..c04d86ec23 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_component/PrintPS.csv @@ -0,0 +1,65 @@ +BE2,BE3,BE1,BE4,TOP +0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,1.0,0.0 +0.0,0.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0,1.0 +0.0,1.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0 +0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,1.0,0.0 +0.0,0.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0,1.0 +0.0,1.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,1.0,0.0 +1.0,0.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0 +1.0,1.0,0.0,1.0,0.0 +1.0,1.0,1.0,0.0,1.0 +1.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,1.0,0.0 +1.0,0.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0 +1.0,1.0,0.0,1.0,0.0 +1.0,1.0,1.0,0.0,1.0 +1.0,1.0,1.0,1.0,1.0 +0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,1.0,0.0 +0.0,0.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0,1.0 +0.0,1.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0 +0.0,0.0,0.0,0.0,0.0 +0.0,0.0,0.0,1.0,0.0 +0.0,0.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0,1.0 +0.0,1.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0,1.0 +0.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,1.0,0.0 +1.0,0.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0 +1.0,1.0,0.0,1.0,0.0 +1.0,1.0,1.0,0.0,1.0 +1.0,1.0,1.0,1.0,1.0 +1.0,0.0,0.0,0.0,0.0 +1.0,0.0,0.0,1.0,0.0 +1.0,0.0,1.0,0.0,1.0 +1.0,0.0,1.0,1.0,1.0 +1.0,1.0,0.0,0.0,0.0 +1.0,1.0,0.0,1.0,0.0 +1.0,1.0,1.0,0.0,1.0 +1.0,1.0,1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_doubleNot/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_doubleNot/PrintPS.csv new file mode 100644 index 0000000000..02b6f192e7 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_doubleNot/PrintPS.csv @@ -0,0 +1,3 @@ +BE1,TOP +0.0,0.0 +1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_iff/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_iff/PrintPS.csv new file mode 100644 index 0000000000..da3529a74d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_iff/PrintPS.csv @@ -0,0 +1,5 @@ +BE2,BE1,TOP +0.0,0.0,1.0 +1.0,0.0,0.0 +0.0,1.0,0.0 +1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_imply/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_imply/PrintPS.csv new file mode 100644 index 0000000000..590c280f04 --- /dev/null +++ 
b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_imply/PrintPS.csv @@ -0,0 +1,5 @@ +BE2,BE1,TOP +0.0,0.0,1.0 +1.0,0.0,1.0 +0.0,1.0,0.0 +1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_multipleFTs/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_multipleFTs/PrintPS.csv new file mode 100644 index 0000000000..e335d22d35 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_multipleFTs/PrintPS.csv @@ -0,0 +1,5 @@ +A,B,TransOne +0.0,0.0,0.0 +0.0,1.0,0.0 +1.0,0.0,0.0 +1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nand/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nand/PrintPS.csv new file mode 100644 index 0000000000..6dabd149f5 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nand/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,1.0 +0.0,1.0,0.0,1.0 +1.0,0.0,0.0,1.0 +1.0,1.0,0.0,1.0 +0.0,0.0,1.0,1.0 +0.0,1.0,1.0,1.0 +1.0,0.0,1.0,1.0 +1.0,1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nor/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nor/PrintPS.csv new file mode 100644 index 0000000000..f7556452ce --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_nor/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,1.0 +0.0,1.0,0.0,0.0 +1.0,0.0,0.0,0.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,0.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,0.0 +1.0,1.0,1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_not/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_not/PrintPS.csv new file mode 100644 index 0000000000..1a9244c63c --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_not/PrintPS.csv @@ -0,0 +1,3 @@ +BE1,TOP +0.0,1.0 +1.0,0.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or/PrintPS.csv new file mode 100644 index 0000000000..3cbc328d95 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0 +1.0,0.0,0.0,1.0 +1.0,1.0,0.0,1.0 +0.0,0.0,1.0,1.0 +0.0,1.0,1.0,1.0 +1.0,0.0,1.0,1.0 +1.0,1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or_houseEvent/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or_houseEvent/PrintPS.csv new file mode 100644 index 0000000000..73ef5a6d89 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_or_houseEvent/PrintPS.csv @@ -0,0 +1,5 @@ +BE2,BE1,TOP +0.0,0.0,1.0 +1.0,0.0,1.0 +0.0,1.0,1.0 +1.0,1.0,1.0 diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_xor/PrintPS.csv b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_xor/PrintPS.csv new file mode 100644 index 0000000000..6c36e72105 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/gold/FTimporter_xor/PrintPS.csv @@ -0,0 +1,9 @@ +BE2,BE3,BE1,TOP +0.0,0.0,0.0,0.0 +0.0,1.0,0.0,1.0 +1.0,0.0,0.0,1.0 +1.0,1.0,0.0,0.0 +0.0,0.0,1.0,1.0 +0.0,1.0,1.0,0.0 +1.0,0.0,1.0,0.0 +1.0,1.0,1.0,1.0 diff --git 
a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and.xml new file mode 100644 index 0000000000..79f1fa5773 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_and + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_and + import,printOnFile + 1 + + + + FT_and.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT.xml new file mode 100644 index 0000000000..ebbc16c913 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_and_withNOT + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_and_withNOT + import,printOnFile + 1 + + + + FT_and_NOT.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_embedded.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_embedded.xml new file mode 100644 index 0000000000..da28aafb7d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_embedded.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_and_withNOT_embedded + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_and_withNOT_embedded + import,printOnFile + 1 + + + + FT_and_NOT_embedded.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_withNOT_embedded.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_withNOT_embedded.xml new file mode 100644 index 0000000000..935123c02d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_and_withNOT_withNOT_embedded.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. 
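A side note on the FTImporter gold files added earlier in this patch: each PrintPS.csv is the truth table of the imported fault tree over its basic events, written out as a PointSet. As an illustration of what those tables encode (this is not the FTImporter implementation, and the TOP event of FT_and.xml is assumed here to be a single AND gate over BE1, BE2 and BE3, which is what the gold table implies), the sketch below enumerates all basic-event combinations and reproduces the rows of gold/FTimporter_and/PrintPS.csv.

# Illustrative sketch, not the FTImporter implementation. The TOP event of
# FT_and.xml is assumed to be a single AND gate over BE1, BE2 and BE3.
import itertools

def top_and(BE1, BE2, BE3):
    # AND gate over the three basic events, reported as 0.0/1.0
    return float(BE1 and BE2 and BE3)

print("BE2,BE3,BE1,TOP")
for BE1, BE2, BE3 in itertools.product((0.0, 1.0), repeat=3):
    print(f"{BE2},{BE3},{BE1},{top_and(BE1, BE2, BE3)}")

The printed rows coincide with gold/FTimporter_and/PrintPS.csv; the other simple gates (or, nand, nor, xor, not) differ only in the Boolean function applied at the TOP event.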
+ + + + + FTimporter_and_withNOT_withNOT_embedded + import,printOnFile + 1 + + + + FT_and_withNOT_withNOT_embedded.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_atleast.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_atleast.xml new file mode 100644 index 0000000000..a4f5a406f0 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_atleast.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_atleast + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_atleast + import,printOnFile + 1 + + + + FT_atleast.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3,BE4 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_cardinality.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_cardinality.xml new file mode 100644 index 0000000000..075d0ce9e6 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_cardinality.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_cardinality + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_cardinality + import,printOnFile + 1 + + + + FT_cardinality.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3,BE4,BE5 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_component.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_component.xml new file mode 100644 index 0000000000..9742a52ce4 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_component.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_component + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. 
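A side note on two of the gates exercised here, whose semantics are less obvious than the plain Boolean operators: judging from the gold truth tables, atleast fires when at least k basic events are true (k = 2 of 4 in gold/FTimporter_atleast/PrintPS.csv), and cardinality fires when the number of true events lies in a closed interval ([2, 3] of 5 in gold/FTimporter_cardinality/PrintPS.csv). A hedged sketch of those predicates, with the thresholds read off the gold files rather than the fault-tree markup:

# Hedged sketch of the gate semantics inferred from the gold truth tables; the
# thresholds below are read off the gold files, not the fault-tree XML.
def atleast(events, k=2):
    # true when at least k basic events are true
    return float(sum(events) >= k)

def cardinality(events, low=2, high=3):
    # true when the number of true basic events lies in [low, high]
    return float(low <= sum(events) <= high)

# Spot checks against gold/FTimporter_atleast/PrintPS.csv and
# gold/FTimporter_cardinality/PrintPS.csv (columns BE2,BE3,BE1,BE4[,BE5],TOP).
assert atleast([0.0, 1.0, 0.0, 1.0]) == 1.0            # two events true  -> TOP 1
assert atleast([1.0, 0.0, 0.0, 0.0]) == 0.0            # one event true   -> TOP 0
assert cardinality([0.0, 0.0, 0.0, 1.0, 1.0]) == 1.0   # two events true  -> TOP 1
assert cardinality([1.0, 1.0, 1.0, 1.0, 0.0]) == 0.0   # four events true -> TOP 0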
+ + + + + FTimporter_component + import,printOnFile + 1 + + + + FT1.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3,BE4 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_doubleNot.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_doubleNot.xml new file mode 100644 index 0000000000..0029bbf41d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_doubleNot.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_doubleNot + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_doubleNot + import,printOnFile + 1 + + + + FTimporter_doubleNot.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_iff.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_iff.xml new file mode 100644 index 0000000000..bded2e6164 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_iff.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_iff + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_iff + import,printOnFile + 1 + + + + FT_iff.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_imply.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_imply.xml new file mode 100644 index 0000000000..ba805ac6f5 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_imply.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_impy + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_imply + import,printOnFile + 1 + + + + FT_imply.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_multipleFTs.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_multipleFTs.xml new file mode 100644 index 0000000000..ed05385370 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_multipleFTs.xml @@ -0,0 +1,60 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_multipleFTs + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. 
Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_multipleFTs + import,printOnFile + 1 + + + + trans_model_data.xml + trans_one.xml + trans_two.xml + + + + + OpenPSA + TransOne + + + + + + trans_model_data + trans_one + trans_two + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + A,B + TransOne + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nand.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nand.xml new file mode 100644 index 0000000000..bfc79957d2 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nand.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_nand + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_nand + import,printOnFile + 1 + + + + FT_nand.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nor.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nor.xml new file mode 100644 index 0000000000..9d4aba8ae1 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_nor.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_nor + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_nor + import,printOnFile + 1 + + + + FT_nor.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_not.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_not.xml new file mode 100644 index 0000000000..2a0a23f78f --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_not.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_not + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. 
+ + + + + FTimporter_not + import,printOnFile + 1 + + + + FTimporter_not.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or.xml new file mode 100644 index 0000000000..eea02cddfe --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_or + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_or + import,printOnFile + 1 + + + + FT_or.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or_houseEvent.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or_houseEvent.xml new file mode 100644 index 0000000000..2264002b03 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_or_houseEvent.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_or_houseEvent + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. + + + + + FTimporter_or_houseEvent + import,printOnFile + 1 + + + + FT_or_houseEvent.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_xor.xml b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_xor.xml new file mode 100644 index 0000000000..57b31da43d --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/test_FTimporter_xor.xml @@ -0,0 +1,56 @@ + + + framework/PostProcessors/FTimporterPostProcessor.FTimporter_xor + mandd + 2017-11-07 + ETimporter + + Tests of the FTImporter post-processor: it read a fault-tree (ET) from an .xml file (eventTree.xml) and it imports + the FT structure into a PointSet. Note that the FT needs to be in an OpenPSA format. 
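A side note on the two-input gates among these FTImporter tests (iff, imply, xor): imply is the only non-symmetric one. gold/FTimporter_imply/PrintPS.csv is false only on the row with BE1 true and BE2 false, i.e. it matches BE1 => BE2 (equivalently, not BE1 or BE2), while iff is the symmetric equivalence. Brief spot checks against the gold rows follow; the child ordering of imply is inferred from the gold table, not from the fault-tree XML.

# Hedged spot checks against gold/FTimporter_imply/PrintPS.csv and
# gold/FTimporter_iff/PrintPS.csv; the child ordering of imply is inferred.
def imply(a, b):
    return float((not a) or b)           # a => b

def iff(a, b):
    return float(bool(a) == bool(b))     # a <=> b

# Gold columns are BE2,BE1,TOP.
assert imply(1.0, 0.0) == 0.0    # row 0.0,1.0,0.0: BE1 true, BE2 false -> TOP 0
assert imply(0.0, 1.0) == 1.0    # row 1.0,0.0,1.0: BE1 false, BE2 true -> TOP 1
assert iff(0.0, 0.0) == 1.0      # row 0.0,0.0,1.0
assert iff(1.0, 0.0) == 0.0      # row 0.0,1.0,0.0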
+ + + + + FTimporter_xor + import,printOnFile + 1 + + + + FT_xor.xml + + + + + OpenPSA + TOP + + + + + + faultTreeTest + FTimporter + FT_PS + + + FT_PS + PrintPS + + + + + + csv + FT_PS + + + + + + BE1,BE2,BE3 + TOP + + + + diff --git a/tests/framework/PostProcessors/FTimporterPostProcessor/tests b/tests/framework/PostProcessors/FTimporterPostProcessor/tests new file mode 100644 index 0000000000..e22c0930a5 --- /dev/null +++ b/tests/framework/PostProcessors/FTimporterPostProcessor/tests @@ -0,0 +1,87 @@ +[Tests] + [./FTimporter_and] + type = 'RavenFramework' + input = 'test_FTimporter_and.xml' + csv = 'FTimporter_and/PrintPS.csv' + [../] + [./FTimporter_and_withNOT] + type = 'RavenFramework' + input = 'test_FTimporter_and_withNOT.xml' + csv = 'FTimporter_and_withNOT/PrintPS.csv' + [../] + [./FTimporter_and_withNOT_embedded] + type = 'RavenFramework' + input = 'test_FTimporter_and_withNOT_embedded.xml' + csv = 'FTimporter_and_withNOT_embedded/PrintPS.csv' + [../] + [./FTimporter_and_withNOT_withNOT_embedded] + type = 'RavenFramework' + input = 'test_FTimporter_and_withNOT_withNOT_embedded.xml' + csv = 'FTimporter_and_withNOT_withNOT_embedded/PrintPS.csv' + [../] + [./FTimporter_atleast] + type = 'RavenFramework' + input = 'test_FTimporter_atleast.xml' + csv = 'FTimporter_atleast/PrintPS.csv' + [../] + [./FTimporter_cardinality] + type = 'RavenFramework' + input = 'test_FTimporter_cardinality.xml' + csv = 'FTimporter_cardinality/PrintPS.csv' + [../] + [./FTimporter_component] + type = 'RavenFramework' + input = 'test_FTimporter_component.xml' + csv = 'FTimporter_component/PrintPS.csv' + [../] + [./FTimporter_iff] + type = 'RavenFramework' + input = 'test_FTimporter_iff.xml' + csv = 'FTimporter_iff/PrintPS.csv' + [../] + [./FTimporter_imply] + type = 'RavenFramework' + input = 'test_FTimporter_imply.xml' + csv = 'FTimporter_imply/PrintPS.csv' + [../] + [./FTimporter_multipleFTs] + type = 'RavenFramework' + input = 'test_FTimporter_multipleFTs.xml' + csv = 'FTimporter_multipleFTs/PrintPS.csv' + [../] + [./FTimporter_nand] + type = 'RavenFramework' + input = 'test_FTimporter_nand.xml' + csv = 'FTimporter_nand/PrintPS.csv' + [../] + [./FTimporter_nor] + type = 'RavenFramework' + input = 'test_FTimporter_nor.xml' + csv = 'FTimporter_nor/PrintPS.csv' + [../] + [./FTimporter_or] + type = 'RavenFramework' + input = 'test_FTimporter_or.xml' + csv = 'FTimporter_or/PrintPS.csv' + [../] + [./FTimporter_or_houseEvent] + type = 'RavenFramework' + input = 'test_FTimporter_or_houseEvent.xml' + csv = 'FTimporter_or_houseEvent/PrintPS.csv' + [../] + [./FTimporter_xor] + type = 'RavenFramework' + input = 'test_FTimporter_xor.xml' + csv = 'FTimporter_xor/PrintPS.csv' + [../] + [./FTimporter_not] + type = 'RavenFramework' + input = 'test_FTimporter_not.xml' + csv = 'FTimporter_not/PrintPS.csv' + [../] + [./FTimporter_doubleNot] + type = 'RavenFramework' + input = 'test_FTimporter_doubleNot.xml' + csv = 'FTimporter_doubleNot/PrintPS.csv' + [../] +[]