Implement file and database IO functions in OpenMCDepcode #189

Merged Mar 7, 2023 (25 commits)

Changes from 16 commits

Commits
0c73c42
Implement OpenMCDepcode.read_depleted_materials()
yardasol Feb 1, 2023
6d62da9
implement OpenMCDepcode.update_depletable_materials
yardasol Feb 2, 2023
d9b8501
implement read_neutronics_parameters and read_step_metadata in
yardasol Feb 2, 2023
1dc2eb3
fix runtime bugs for OpenMC coupled simulations
yardasol Feb 9, 2023
d2c6957
runtime bugfixes
yardasol Feb 13, 2023
b52b618
fix test suite
yardasol Feb 14, 2023
7b5920b
update release notes
yardasol Feb 14, 2023
bf25d28
fix CI
yardasol Feb 15, 2023
780e39c
add some machinery to enable output consistency when restarting faile…
yardasol Feb 22, 2023
ff87b5f
Apply suggestions from @samgdotson's code review
yardasol Feb 22, 2023
0379014
split up helper function
yardasol Feb 22, 2023
8c6fc84
Merge branch 'openmc-file-io' of github.com:yardasol/saltproc into op…
yardasol Feb 22, 2023
0e00b14
fix duplicate id warnings; bugfix in _fix_nuclide_discrepancy
yardasol Feb 23, 2023
3abb8d4
Apply suggestions from code review
yardasol Feb 23, 2023
34f7d75
fix docstring in Depcode.run_depletion_step()
yardasol Feb 23, 2023
1765a7c
Merge branch 'openmc-file-io' of github.com:yardasol/saltproc into op…
yardasol Feb 23, 2023
164ed20
fix duplicate id warnings
yardasol Feb 23, 2023
2c16079
typo fix
yardasol Feb 23, 2023
bf8a44b
@LukeSefiert suggestions for openmc_depcode.py
yardasol Feb 27, 2023
d5b7491
update batch and particle settings for convergence
yardasol Feb 27, 2023
12cfdb9
Apply suggestions from code review
yardasol Feb 27, 2023
3749522
typo fixes; fix tests; update msbr example
yardasol Feb 28, 2023
b4ce1f3
fix run_depletion_step
yardasol Feb 28, 2023
652a34e
docstring fixes
yardasol Mar 2, 2023
31cbef5
fix constant reprocessing integration test
yardasol Mar 6, 2023
10 changes: 7 additions & 3 deletions .github/workflows/test-saltproc.yml
@@ -14,7 +14,7 @@ on:
workflow_dispatch:

env:
CACHE_NUMBER: 0 #change to manually reset cache
CACHE_NUMBER: 6 #change to manually reset cache

jobs:
test-saltproc:
@@ -57,6 +57,7 @@ jobs:
path: |
/usr/share/miniconda3/envs/saltproc-env
~/openmc_src
~/mcpl_src
Contributor:
What is mcpl? I'm not familiar with that; I'm assuming the directory name is for a code's source.

Author:
I'm not entirely sure, but it appears to be some sort of file format. I added MCPL to SaltProc's CI in #182, but I forgot to add full caching support, which I am doing here.

Contributor:
Ok I just hadn't heard of it before and I was curious.

~/endfb71_hdf5
~/.cache/pip
key: depcache-${{ hashFiles('environment.yml') }}-${{ env.DATE }}-${{ env.CACHE_NUMBER }}
@@ -94,7 +95,10 @@ jobs:
- name: Install SaltProc
run: pip install .

- name: Test SaltProc
- name: Environment variables
run: |
echo "OPENMC_CROSS_SECTIONS=$HOME/endfb71_hdf5/cross_sections.xml" >> $GITHUB_ENV
pytest --ignore tests/integration_tests/run_no_reprocessing --ignore tests/integration_tests/run_constant_reprocessing tests/

- name: Test SaltProc
run: |
pytest --ignore tests/integration_tests/run_no_reprocessing_serpent --ignore tests/integration_tests/run_no_reprocessing_openmc --ignore tests/integration_tests/run_constant_reprocessing_serpent --ignore tests/integration_tests/run_constant_reprocessing_openmc tests/
Contributor:
Why are you ignoring all of these integration tests?

Author:
The Serpent ones require export-controlled software, which I don't even know how I would get onto a GitHub Actions runner, and the OpenMC ones take too long to run for CI, in my opinion.

Contributor:
Ok. I get not including the serpent tests. Is there a way to speed up the OpenMC tests? Also, what is even being tested in the CI if you include tests/ in the list of directories to ignore?

Contributor:
I second Amanda's question.

Author:
> Is there a way to speed up the OpenMC tests?

I haven't been able to find a way to do this.

> Also, what is even being tested in the CI if you include tests/ in the list of directories to ignore?

tests/ is not being ignored; only the directories passed to the --ignore flags are skipped. You can look at the Checks tab to verify this.
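
For context (an aside, not part of the PR discussion): pytest's --ignore option only prunes the listed paths from collection; everything else under the given root still runs. A minimal sketch using pytest's Python entry point, with a shortened option list for illustration:

```python
# Illustrative sketch only (shortened path list). pytest collects everything
# under tests/ except the paths excluded with --ignore.
import pytest

exit_code = pytest.main([
    "--ignore", "tests/integration_tests/run_no_reprocessing_serpent",
    "--ignore", "tests/integration_tests/run_constant_reprocessing_serpent",
    "tests/",  # collection root: unit tests and any non-ignored tests still run
])
```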

5 changes: 5 additions & 0 deletions doc/releasenotes/v0.5.0.rst
@@ -173,11 +173,15 @@ Python API Changes
- (new function) → ``get_neutron_settings()``
- (new function) → ``_get_burnable_materials_file()``
- (new function) → ``_get_burnable_material_card_data()``
- (new function) → ``resolve_include_paths()``
- (new function) → ``_convert_name_to_nuccode()``
- (new parameter) → ``zaid_convention``


- ``OpenMCDepcode`` is a ``Depcode`` subclass that interfaces with ``openmc``. This class implements the following functions
- ``run_depletion_step()``
- ``write_saltproc_openmc_tallies()``
- ``convert_nuclide_code_to_name()``
- ``switch_to_next_geometry()``
- ``write_runtime_input()``
- ``write_depletion_settings()``
@@ -206,6 +210,7 @@ Python API Changes

- ``core_number`` → (removed)
- ``node_number`` → (removed)
- (new function) → ``_add_missing_nuclides()``

- ``Sparger``

2 changes: 1 addition & 1 deletion examples/msbr/msbr_endfb71.serpent
@@ -58,7 +58,7 @@ set sfylib "endfb71.sfy"

% --- Neutron population and criticality cycles:

set pop 300 400 10
set pop 10000 40 10
set gcu -1
%set usym 0 3 2 0.0 0.0 0 90

6 changes: 2 additions & 4 deletions examples/msbr/msbr_endfb71_main.json
@@ -2,17 +2,15 @@
"proc_input_file": "msbr_objects.json",
"dot_input_file": "msbr.dot",
"n_depletion_steps": 12,
"output_path": "serpent_msbr_test",
"depcode": {
"codename": "serpent",
"exec_path": "sss2",
"template_input_file_path": "./msbr_endfb71.serpent",
"geo_file_paths": ["./geometry/msbr_full.ini"]
},
"simulation": {
"sim_name": "msbr_example_simulation",
"db_name": "msbr_kl_100_saltproc.h5",
"restart_flag": false,
"adjust_geo": false
"sim_name": "msbr_serpent_test"
},
"reactor": {
"volume": 1.0,
20 changes: 14 additions & 6 deletions examples/msbr/openmc_msbr_model.py
@@ -51,15 +51,22 @@ def parse_arguments():
-------
deplete : bool
Flag indicating whether or not to run a depletion simulation.
volume : bool
Contributor:
Suggested change
volume : bool
volume_calculation: bool

This just feels like a more descriptive variable name

Contributor:
Seconded, though I also agree with later comments that volume_calculation is very close to a function name, and something like stoch_volume or stoch_vol might work.

Flag indicating whether or not to run a stochastic volume calculation.
"""
parser = argparse.ArgumentParser()
parser.add_argument('--deplete',
type=bool,
default=False,
help='flag for running depletion')
parser.add_argument('--volume',
Contributor:
Suggested change
parser.add_argument('--volume',
parser.add_argument('--volume_calculation',

If you change the name earlier, the code here should follow suit.

type=bool,
default=False,
help='flag for running stochastic volume calculation')


args = parser.parse_args()
return bool(args.deplete)
return bool(args.deplete), bool(args.volume)
Contributor:
See above
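
As a side note on the renaming discussion (a hedged sketch, not code from this PR): whichever name is chosen, argparse's store_true action would also sidestep the type=bool pitfall, where any non-empty string, including 'False', parses as True:

```python
# Hypothetical variant using the reviewers' suggested name 'stoch_vol'.
# action='store_true' defaults the flag to False and sets it to True only
# when the flag is present, avoiding bool('False') == True surprises.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--deplete',
                    action='store_true',
                    help='flag for running depletion')
parser.add_argument('--stoch_vol',
                    action='store_true',
                    help='flag for running stochastic volume calculation')
args = parser.parse_args()
deplete, stoch_vol = args.deplete, args.stoch_vol
```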


def shared_elem_geometry(elem_type='core',
gr_sq_d=4.953,
@@ -372,7 +379,7 @@ def plot_geometry(name,
return plot


deplete = parse_arguments()
deplete, volume = parse_arguments()
Contributor:
See above


(zone_bounds,
core_bounds,
@@ -458,10 +465,11 @@ def plot_geometry(name,
'range': (800, 1000)}

ll, ur = geo.root_universe.bounding_box
msbr_volume_calc = openmc.VolumeCalculation([fuel, moder], 1000000000, ll, ur)
#msbr_volume_calc.set_trigger(1e-03, 'rel_err')
settings.volume_calculations = [msbr_volume_calc]
settings.run_mode = 'volume'
if volume:
Contributor:
see above

msbr_volume_calc = openmc.VolumeCalculation([fuel, moder], int(1e10), ll, ur)
#msbr_volume_calc.set_trigger(1e-03, 'rel_err')
settings.volume_calculations = [msbr_volume_calc]
Contributor:
Ok, maybe my suggestion wasn't that great because it's pretty similar to this one.

Contributor:
Maybe something like stoch_volume?

Contributor:
Or stoch_vol, if you're worried about length?

settings.run_mode = 'volume'
settings.export_to_xml()
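
Side note on the commented-out set_trigger line (a sketch assuming OpenMC's VolumeCalculation.set_trigger API, not part of this diff): the stochastic volume calculation could run to a target relative error instead of a fixed sample count:

```python
# Hedged sketch; fuel, moder, ll, ur, and settings are defined earlier in
# openmc_msbr_model.py. The sample count then only sets samples per iteration.
msbr_volume_calc = openmc.VolumeCalculation([fuel, moder], int(1e7), ll, ur)
msbr_volume_calc.set_trigger(1e-3, 'rel_err')  # iterate until 0.1% relative error
settings.volume_calculations = [msbr_volume_calc]
settings.run_mode = 'volume'
```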

## Slice plots
100 changes: 95 additions & 5 deletions saltproc/abc.py
@@ -1,3 +1,6 @@
import sys
import subprocess

from abc import ABC, abstractmethod

class Depcode(ABC):
@@ -99,21 +102,37 @@ def read_depleted_materials(self, read_at_end=False):
:class:`Materialflow` object holding material composition and properties.

"""

@abstractmethod
def run_depletion_step(self, mpi_args, threads):
def run_depletion_step(self, mpi_args, args):
"""Runs a depletion step as a subprocess with the given parameters.

Parameters
----------
mpi_args : list of str
Arguments for running simulations on supercomputers using
``mpiexec`` or similar programs.
threads : int
Threads to use for shared-memory parallelism
args : list of str
Contributor:
Since you changed the name of threads to args, shouldn't the docstring also change? Otherwise, I'm not sure why this needed to be args rather than threads.

Arguments for running depletion step.

"""

print('Running %s' % (self.codename))
try:
if mpi_args is None:
stdout = sys.stdout
else:
stdout = None
subprocess.run(
args,
check=True,
cwd=self.output_path,
stdout=stdout,
stderr=subprocess.STDOUT)
Contributor:
This may cause issues on windows machines, FYI.
See related watts issue.

Contributor:
Could be okay if you're not using PIPE.

print(f'Finished {self.codename.upper()} Run')
except subprocess.CalledProcessError as error:
print(error.output.decode("utf-8"))
raise RuntimeError('\n %s RUN FAILED\n see error message above'
% (self.codename))
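
A rough usage sketch (hypothetical values, not from this PR): a concrete Depcode subclass assembles the full command list, prepending the MPI launcher when mpi_args is given, and the base class runs it in output_path:

```python
# Hypothetical invocation; the Serpent/OpenMC subclasses build their own
# argument lists. 'sss2' is the Serpent executable named in the MSBR example.
mpi_args = ['mpiexec', '-n', '2']
args = mpi_args + ['sss2', '-omp', '4', 'msbr_endfb71.serpent']
depcode.run_depletion_step(mpi_args, args)

# Serial run: mpi_args=None keeps the code's output attached to sys.stdout.
depcode.run_depletion_step(None, ['sss2', '-omp', '4', 'msbr_endfb71.serpent'])
```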

@abstractmethod
def switch_to_next_geometry(self):
"""Changes the geometry used in the depletion code simulation to the
@@ -153,5 +172,76 @@ def update_depletable_materials(self, mats, dep_end_time):

"""

def read_plaintext_file(self, file_path):
"""Reads the content of a plaintext file for use by other methods.

Parameters
----------
file_path : str
Path to file.

Returns
-------
file_lines : list of str
File lines.

"""
file_lines = []
with open(file_path, 'r') as file:
file_lines = file.readlines()
return file_lines

@abstractmethod
def convert_nuclide_code_to_name(self, nuc_code):
"""Converts depcode nuclide code to symbolic nuclide name.

Parameters
----------
nuc_code : str
Nuclide code

Returns
-------
nuc_name : str
Symbolic nuclide name (`Am242m1`).

"""

Contributor (on lines +195 to +209):
I'm not very familiar with abc, what does this do?
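
For context on the question above (an illustrative sketch, not code from this PR): abc.abstractmethod marks methods that concrete subclasses must implement, and Python refuses to instantiate a subclass that leaves any of them undefined:

```python
# Minimal illustration of abc.ABC with @abstractmethod.
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def convert(self, code):
        """Subclasses must provide this."""

class Complete(Base):
    def convert(self, code):
        return str(code)

class Incomplete(Base):
    pass

print(Complete().convert(95242))  # works: all abstract methods implemented
try:
    Incomplete()
except TypeError as err:
    print(err)  # can't instantiate abstract class Incomplete
```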

@abstractmethod
def _convert_name_to_nuccode(self, nucname):
"""Converts depcode nuclide name to ZA nuclide code

Parameters
----------
nucname : str
Nuclide name

Returns
-------
nuc_code : str
ZA nuclide code

"""
Contributor:
Are _convert_name_to_nuccode() and convert_nuclide_code_to_name() placeholders for now?

Author:
No, but I'd like to consolidate many of these nuclide-code-to-nuclide-name conversion functions. See issue #191.
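
To illustrate what these helpers are for (a simplified sketch, not the implementation in this PR): a ZA code packs the atomic number Z and the mass number A as Z*1000 + A, so 'Am242m1' corresponds to 95242; metastable-state suffix handling is deliberately left out here:

```python
# Simplified, hypothetical name -> ZA sketch; the real classes also handle
# metastable states and the full element table.
import re

_Z = {'H': 1, 'U': 92, 'Am': 95}  # tiny excerpt of the element table

def name_to_za(nucname):
    symbol, mass = re.match(r'([A-Za-z]+)(\d+)', nucname).groups()
    return _Z[symbol] * 1000 + int(mass)

print(name_to_za('Am242m1'))  # 95242
print(name_to_za('U235'))     # 92235
```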


def preserve_simulation_files(self, step_idx):
"""Move simulation input and output files
to unique a directory

Parameters
----------
step_idx : int
Index of the current depletion step.

"""
step_results_dir = self.output_path / f'step_{step_idx}_data'
step_results_dir.mkdir(exist_ok=True)

file_path = lambda file : self.output_path / file
output_paths = list(map(file_path, self._OUTPUTFILE_NAMES))
input_paths = list(map(file_path, self._INPUTFILE_NAMES))
for file_path, fname in zip(output_paths, self._OUTPUTFILE_NAMES):
file_path.rename(step_results_dir / fname)

for file_path, fname in zip(input_paths, self._INPUTFILE_NAMES):
lines = self.read_plaintext_file(file_path)
with open(step_results_dir / fname, 'w') as out_file:
out_file.writelines(lines)
43 changes: 32 additions & 11 deletions saltproc/app.py
@@ -1,3 +1,4 @@
import os
from pathlib import Path
from copy import deepcopy

@@ -27,6 +28,11 @@
DAY_UNITS = ('d', 'day')
YEAR_UNITS = ('a', 'year', 'yr')

_SECONDS_PER_DAY = 60 * 60 * 24
_MINUTES_PER_DAY = 60 * 24
_HOURS_PER_DAY = 24
_DAYS_PER_YEAR = 365.25

def run():
""" Inititializes main run"""
threads, saltproc_input = parse_arguments()
@@ -40,10 +46,10 @@ def run():
msr = _create_reactor_object(object_input[2])

# Check: Restarting previous simulation or starting new?
simulation.check_restart()
failed_step = simulation.check_restart()
# Run sequence
# Start sequence
for step_idx in range(len(msr.depletion_timesteps)):
for step_idx in range(failed_step, len(msr.depletion_timesteps)):
print("\n\n\nStep #%i has been started" % (step_idx + 1))
simulation.sim_depcode.write_runtime_input(msr,
step_idx,
@@ -60,6 +66,7 @@ def run():
mats = depcode.read_depleted_materials(True)
simulation.store_mat_data(mats, step_idx, False)
simulation.store_run_step_info()

# Reprocessing here
print("\nMass and volume of fuel before reproc: %f g, %f cm3" %
(mats['fuel'].mass,
@@ -90,6 +97,10 @@ def run():
# Store in DB after reprocessing and refill (right before next depl)
simulation.store_after_repr(mats, waste_and_feed_streams, step_idx)
depcode.update_depletable_materials(mats, simulation.burn_time)

# Preserve depletion and transport result and input files
depcode.preserve_simulation_files(step_idx)

del mats, waste_streams, waste_and_feed_streams, extracted_mass
gc.collect()
# Switch to another geometry?
@@ -175,7 +186,8 @@ def read_main_input(main_inp_file):
traceback.print_exc()

# Global input path
input_path = (Path.cwd() / Path(f.name).parents[0])
input_path = (Path.cwd() / Path(f.name).parents[0]).resolve()
os.chdir(input_path)

# Saltproc settings
process_file = str((input_path /
@@ -213,6 +225,16 @@ def read_main_input(main_inp_file):
depcode_input['chain_file_path'] = \
str((input_path /
depcode_input['chain_file_path']).resolve())

# process depletion_settings
depletion_settings = depcode_input['depletion_settings']
operator_kwargs = depletion_settings['operator_kwargs']
if operator_kwargs != {}:
fission_q_path = operator_kwargs['fission_q']
if fission_q_path is not None:
operator_kwargs['fission_q'] = str(input_path / fission_q_path)
depletion_settings['operator_kwargs'] = operator_kwargs
depcode_input['depletion_settings'] = depletion_settings
else:
raise ValueError(f'{codename} is not a supported depletion code.'
' Accepts: "serpent" or "openmc".')
@@ -244,7 +266,7 @@ def _print_simulation_input_info(simulation_input, depcode_input):
str(simulation_input['restart_flag']) +
'\n'
'\tTemplate File Path = ' +
depcode_input['template_input_file_path'] +
str(depcode_input['template_input_file_path']) +
'\n'
'\tOutput HDF5 database Path = ' +
simulation_input['db_name'] +
@@ -303,8 +325,8 @@ def _process_main_input_reactor_params(reactor_input,
depletion_timesteps,
codename)

reactor_input['depletion_timesteps'] = list(depletion_timesteps)
reactor_input['power_levels'] = list(power_levels)
reactor_input['depletion_timesteps'] = depletion_timesteps.tolist()
reactor_input['power_levels'] = power_levels.tolist()

return reactor_input

@@ -344,19 +366,18 @@ def _scale_depletion_timesteps(timestep_units, depletion_timesteps, codename):
# serpent base timestep units are days or mwd/kg
if not(timestep_units in DAY_UNITS) and timestep_units.lower() != 'mwd/kg' and codename == 'serpent':
if timestep_units in SECOND_UNITS:
depletion_timesteps /= 60 * 60 * 24
depletion_timesteps /= _SECONDS_PER_DAY
elif timestep_units in MINUTE_UNITS:
depletion_timesteps /= 60 * 24
depletion_timesteps /= _MINUTES_PER_DAY
elif timestep_units in HOUR_UNITS:
depletion_timesteps /= 24
depletion_timesteps /= _HOURS_PER_DAY
elif timestep_units in YEAR_UNITS:
depletion_timesteps *= 365.25
depletion_timesteps *= _DAYS_PER_YEAR
else:
raise IOError(f'Unrecognized time unit: {timestep_units}')

return depletion_timesteps
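
A small worked example of the scaling above (illustrative values): Serpent expects depletion steps in days, so timesteps supplied in hours are divided by _HOURS_PER_DAY, while other cases leave the array unchanged in this function:

```python
# Worked example of the conversion above (values illustrative).
import numpy as np

_HOURS_PER_DAY = 24
depletion_timesteps = np.array([6.0, 12.0, 24.0])  # supplied in hours

# timestep_units in HOUR_UNITS and codename == 'serpent':
depletion_timesteps /= _HOURS_PER_DAY
print(depletion_timesteps)  # [0.25 0.5 1.0] -> days, as Serpent expects
```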


def reprocess_materials(mats, process_file, dot_file):
"""Applies extraction reprocessing scheme to burnable materials.
