This document describes how to set up a development environment to modify, build and test MODFLOW 6. Details on how to contribute your code to the repository are found in the separate document CONTRIBUTING.md.
To build and test a parallel version of the program, first read the instructions below and then continue in PARALLEL.md.
Before you can build and test MODFLOW 6, you must install and configure the following on your development machine:
- git
- Python 3.8+
- a modern Fortran compiler
Some additional, optional tools are also discussed below.
Install Git and/or the GitHub app (for Mac or Windows). GitHub's Guide to Installing Git is a good source of information.
The GNU Fortran compiler `gfortran` or the Intel Fortran Classic compiler `ifort` can be used to compile MODFLOW 6.
Note: support for the next-generation Intel Fortran compiler `ifx` is still limited; see the compatibility table below.
GNU Fortran can be installed on all three major platforms.
- fedora-based: `dnf install gcc-gfortran`
- debian-based: `apt install gfortran`
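- macOS: for example, `brew install gcc` via Homebrew (`gfortran` is included with the GCC formula)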
- Download the Minimalist GNU for Windows (MinGW) installer from Source Forge: https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win32/Personal%20Builds/mingw-builds/installer/mingw-w64-install.exe
- Run the installer. Make sure to change `Architecture` to `x86_64`. Leave the other settings on default.
- Find the `mingw64/bin` directory in the installation and add it to your PATH. Find `Edit the system environment variables` in your Windows Start Screen. Click the `Environmental Variables` button and double-click the `Path` variable in the User Variables (the top table). Click the `New` button and enter the location of the `mingw64/bin` directory.
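To confirm the compiler is discoverable, open a new terminal session and check the version:

```shell
gfortran --version
```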
Intel Fortran can also be used to compile MODFLOW 6 and associated utilities. The `ifort` and `ifx` compilers are available in the Intel oneAPI HPC Toolkit.

A number of environment variables must be set before using Intel Fortran. General information can be found here, with specific instructions to configure a shell session for `ifort` here.
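For example, on Linux or macOS the oneAPI environment can typically be initialized by sourcing the `setvars.sh` script (assuming the default installation location):

```shell
# default oneAPI install location; adjust if installed elsewhere
source /opt/intel/oneapi/setvars.sh
```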
While the current development version of MODFLOW 6 is broadly compatible with `ifort`, `ifx` compatibility is still limited. The following table documents whether MODFLOW 6 can be successfully built with particular platform/compiler combinations, along with notes on relevant errors if not.

Note: this table is not exhaustive and only details the currently tested subset of combinations.
| Platform | Compiler | Version | Compatible? | Notes |
|---|---|---|---|---|
| Ubuntu 22.04 | ifort | 2021.[6-10] | ✓ | |
| Ubuntu 22.04 | ifx | 2022.2.[0-1] | ✓ | some autotests fail (convergence failure, bad head comparison) |
| Ubuntu 22.04 | ifx | 2022.1 | ✗ | segfault in meson serial simulation test |
| Ubuntu 22.04 | ifx | 2022.0, 2021.[1,2,4] | ✗ | compilation failure (segfault) |
| macOS 12 (Monterey) | ifort | 2021.[6-10] | ✓ | |
| macOS (any) | ifx | all | ✗ | ifx support for macOS is not planned |
| Windows 10 (Server 2022) | ifort | 2021.[6-10] | ✓ | |
| Windows 10 (Server 2022) | ifx | 2023.[0-1] | ✗ | compilation failure |
| Windows 10 (Server 2022) | ifx | 2022.2 | ✓ | |
| Windows 10 (Server 2022) | ifx | 2022.1 | ✗ | linking failure |
On Windows, Visual Studio and a number of libraries must be installed for `ifort` to work. The required libraries can be installed by ticking the "Desktop Development with C++" checkbox in the Visual Studio Installer's Workloads tab.

Note: invoking the `setvars.bat` script from a PowerShell session will not put `ifort` on the path, since batch script environments are local to their process. To relaunch PowerShell with oneAPI variables configured:

```shell
cmd.exe "/K" '"C:\Program Files (x86)\Intel\oneAPI\setvars-vcvarsall.bat" && "C:\Program Files (x86)\Intel\oneAPI\compiler\latest\env\vars.bat" && powershell'
```
Python 3.8+ is required to run MODFLOW 6 tests. A Conda distribution (e.g. Miniconda or Anaconda) is recommended. Python dependencies are specified in `environment.yml`. To create an environment, run from the project root:

```shell
conda env create -f environment.yml
```

To update an existing environment:

```shell
conda env update -f environment.yml
```
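The environment can then be activated before building or testing (assuming the environment defined in `environment.yml` is named `modflow6`):

```shell
# activate the environment created from environment.yml
conda activate modflow6
```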
This project depends critically on a few Python packages for building, linting and testing tasks:
- `meson`
- `fprettify`
- `pymake`
- `flopy`

These are each described briefly below. The Conda `environment.yml` contains a number of other dependencies also required for various development tasks, but they are not described in detail here.
Meson is recommended for building MODFLOW 6 and is included in `environment.yml`. It can also be installed independently; note that if you do so you will need to manually add the executable to the PATH.
`fprettify` can be used to format Fortran source code, and in combination with the MODFLOW 6 fprettify configuration establishes a contribution standard for properly formatted MODFLOW 6 Fortran source. This tool can be installed with `pip` or `conda` and used from the command line or integrated with a VSCode or Visual Studio development environment. The `fprettify` package is included in the Conda environment in `environment.yml`. See the contribution guidelines for additional information.
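For example, a single source file can be formatted in place from the command line (assuming the project's fprettify configuration lives in `.fprettify.yaml` at the repository root):

```shell
# format one file using the repository's fprettify configuration
fprettify -c .fprettify.yaml src/mf6.f90
```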
The `mfpymake` package can build MODFLOW 6 and related programs and artifacts (e.g. makefiles), and is used in particular by the `distribution/build_makefiles.py` script. `mfpymake` is included in the Conda environment in `environment.yml`. To install it separately, follow the instructions in the README of the repository, which also demonstrates basic usage.
`flopy` is used throughout MODFLOW 6 tests to create, run and post-process models. Like MODFLOW 6, `flopy` is modular: for each MODFLOW 6 package there is generally a corresponding `flopy` plugin. Plugins are generated dynamically from DFN files stored in this repository under `doc/mf6io/mf6ivar/dfn`.
The tests use a set of shared fixtures and utilities provided by the `modflow-devtools` package. This package is included in the Conda environment in `environment.yml`.
Some other tools are useful but not required to develop MODFLOW 6.
This repository provides makefiles, generated by `mfpymake`, which can be used to build MODFLOW 6 with GNU Make. For further instructions we refer to the GNU Make Manual.
Visual Studio installers can be downloaded from the official website. MODFLOW 6 solution files can be found in the `msvs` folder.
Doxygen is used to generate the MODFLOW 6 source code documentation. Graphviz is used by doxygen to produce source code diagrams. LaTeX is used to generate the MODFLOW 6 release notes and Input/Output documents (docs/mf6io/mf6io.nightlybuild).
These programs can be installed from various sources, including Conda, MacPorts, or individual distributions such as https://www.tug.org/. Details about the USGS LaTeX libraries, as well as the Linux install steps, can be found in the CI workflow for the docs (`.github/workflows/ci-docs.yml`).
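For instance, Doxygen and Graphviz can typically be installed from the conda-forge channel:

```shell
# install doxygen and graphviz into the active Conda environment
conda install -c conda-forge doxygen graphviz
```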
Fork and clone the MODFLOW 6 repository:
- Login to your GitHub account or create one by following the instructions given here.
- Fork the main MODFLOW 6 repository.
- Clone your fork of the MODFLOW 6 repository and add an `upstream` remote pointing to the main MODFLOW 6 repository.

```shell
# Clone your GitHub repository:
git clone git@github.com:<github username>/modflow6.git

# Go to the MODFLOW 6 directory:
cd modflow6

# Add the main MODFLOW 6 repository as an upstream remote to your repository:
git remote add upstream https://github.com/MODFLOW-USGS/modflow6.git
```
Meson is the recommended build tool for MODFLOW 6. Meson must be installed and on your PATH. Creating and activating the Conda environment from `environment.yml` should be sufficient for this.
Meson build configuration files are provided for MODFLOW 6 as well as the `zbud6` and `mf5to6` utility programs:

- `meson.build`
- `utils/zonebudget/meson.build`
- `utils/mf5to6/meson.build`
To build MODFLOW 6, first configure the build directory. By default Meson uses compiler flags for a release build. To create a debug build, add `-Doptimization=0` to the following `setup` command.

```shell
# bash (linux and macOS)
meson setup builddir --prefix=$(pwd) --libdir=bin

# cmd (windows)
meson setup builddir --prefix=%CD% --libdir=bin
```
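For example, a debug configuration on Linux or macOS would look like:

```shell
# debug build: disable compiler optimization
meson setup builddir -Doptimization=0 --prefix=$(pwd) --libdir=bin
```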
Compile MODFLOW 6 by executing:

```shell
meson compile -C builddir
```
In order to run the tests the binaries have to be installed:

```shell
meson install -C builddir
```

The binaries can then be found in the `bin` folder. `meson install` also triggers a compilation if necessary, so executing `meson install` is enough to get up-to-date binaries in the `bin` folder.
Note: If using Visual Studio Code, you can use tasks as described here to automate the above.
MODFLOW 6 tests are driven with `pytest`, with the help of plugins like `pytest-xdist` and `pytest-cases`. Testing dependencies are included in the Conda environment `environment.yml`.

Note: the entire test suite should pass before a pull request is submitted. Tests run in GitHub Actions CI and a PR can only be merged with passing tests. See `CONTRIBUTING.md` for more information.
A few tasks must be completed before running tests:
- build local MODFLOW 6 development version
- rebuild the last MODFLOW 6 release
- install additional executables
- update FloPy packages and plugins
- clone MODFLOW 6 test model and example repositories
Tests expect binaries to live in the `bin` directory relative to the project root, as configured above in the `meson` commands. Binaries are organized as follows:

- local development binaries in the top-level `bin` folder
- binaries rebuilt in development mode from the latest release in `bin/rebuilt`
- related programs installed from the executables distribution in `bin/downloaded`

Tests must be run from the `autotest` folder.
Before running tests, the local development version of MODFLOW 6 must be built with `meson` as described above. The `autotest/build_exes.py` script is provided as a shortcut to easily rebuild local binaries. The script can be run from the project root with:

```shell
python autotest/build_exes.py
```

Alternatively, it can be run from the `autotest` directory with `pytest`:

```shell
pytest build_exes.py
```

By default, binaries will be placed in the `bin` directory relative to the project root, as in the `meson` commands described above. To change the location of the binaries, use the `--path` option.
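For instance, to place binaries in an alternative location (the path below is just a placeholder):

```shell
# /path/to/bin is a placeholder; substitute your own directory
python autotest/build_exes.py --path /path/to/bin
```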
Tests require the latest official MODFLOW 6 release to be compiled in develop mode with the same Fortran compiler as the development version. A number of binaries distributed from the executables repo must also be installed. The `autotest/get_exes.py` script does both of these things. It can be run from the project root with:

```shell
python autotest/get_exes.py
```

Alternatively, with `pytest` from the `autotest` directory:

```shell
pytest get_exes.py
```

By default, binaries will be placed in the `bin` directory relative to the project root, as in the `meson` commands described above. Nested `bin/rebuilt` and `bin/downloaded` directories are created to contain the rebuilt last release and the downloaded executables, respectively. To change the location of the binaries, use the `--path` option.
Plugins should be regenerated from DFN files before running tests for the first time or after definition files change. This can be done with the `autotest/update_flopy.py` script, which wipes and regenerates plugin classes for the `flopy` installed in the Python environment.

Note: if you've installed a local version of `flopy` from source, running this script can overwrite files in your repository.

There is a single optional argument, the path to the folder containing definition files. By default DFN files are assumed to live in `doc/mf6io/mf6ivar/dfn`, making the following identical:

```shell
python autotest/update_flopy.py
python autotest/update_flopy.py doc/mf6io/mf6ivar/dfn
```
Some autotests load example models from external repositories:

- `MODFLOW-USGS/modflow6-testmodels`
- `MODFLOW-USGS/modflow6-largetestmodels`
- `MODFLOW-USGS/modflow6-examples`

By default, the tests expect these repositories side-by-side with (i.e. in the same parent directory as) the `modflow6` repository. If the repos are somewhere else, you can set the `REPOS_PATH` environment variable to point to their parent directory. If external model repositories are not found, tests requiring them will be skipped.
Note: a convenient way to persist environment variables needed for tests is to store them in a `.env` file in the `autotest` folder. Each variable should be defined on a separate line, with format `KEY=VALUE`. The `pytest-dotenv` plugin will then automatically load any variables found in this file into the test process' environment.
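For instance, an `autotest/.env` file pointing the tests at the model repositories might contain (the path is a placeholder):

```shell
# REPOS_PATH is the directory containing the cloned model repositories
REPOS_PATH=/path/to/parent/directory
```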
The test model repos can simply be cloned; ideally, into the parent directory of the `modflow6` repository, so that repositories live side-by-side:

```shell
git clone https://github.com/MODFLOW-USGS/modflow6-testmodels.git
git clone https://github.com/MODFLOW-USGS/modflow6-largetestmodels.git
```
First clone the example models repo:

```shell
git clone https://github.com/MODFLOW-USGS/modflow6-examples.git
```
The example models require some setup after cloning. Some extra Python dependencies are required to build the examples:

```shell
cd modflow6-examples/etc
pip install -r requirements.pip.txt
```

Then, still from the `etc` folder, run:

```shell
python ci_build_files.py
```

This will build the examples for subsequent use by the tests.
Tests are driven by `pytest` and must be run from the `autotest` folder. To run tests in a particular file, showing verbose output, use:

```shell
pytest -v <file>
```

Tests can be run in parallel with the `-n` option, which accepts an integer argument for the number of parallel processes. If the value `auto` is provided, `pytest-xdist` will use one worker per available processor.

```shell
pytest -v -n auto
```
Markers can be used to select subsets of tests. Markers provided in `pytest.ini` include:

- `slow`: tests that take longer than a few seconds to complete
- `repo`: tests that require external model repositories
- `large`: tests using large models (from the `modflow6-examples` and `modflow6-largetestmodels` repos)
- `regression`: tests comparing results from multiple versions

Markers can be used with the `-m <marker>` option, and can be applied in boolean combinations with `and`, `or` and `not`. For instance, to run fast tests in parallel, excluding regression tests:

```shell
pytest -v -n auto -m "not slow and not regression"
```
The `--smoke` (short `-S`) flag, provided by `modflow-devtools`, is an alias for the above:

```shell
pytest -v -n auto -S
```

Smoke testing is a form of integration testing which aims to test a decent fraction of the codebase quickly enough to run often during development.

Tests using models from external repositories can be selected with the `repo` marker:

```shell
pytest -v -n auto -m "repo"
```
The `large` marker is a subset of the `repo` marker. To test models excluded from commit-triggered CI and only run on GitHub Actions nightly:

```shell
pytest -v -n auto -m "large"
```
Test scripts for external model repositories can also be run independently:
```shell
# MODFLOW 6 test models
pytest -v -n auto test_z01_testmodels_mf6.py

# MODFLOW 5 to 6 conversion test models
pytest -v -n auto test_z02_testmodels_mf5to6.py

# models from modflow6-examples repo
pytest -v -n auto test_z03_examples.py

# models from modflow6-largetestmodels repo
pytest -v -n auto test_z03_largetestmodels.py
```
Tests load external models from fixtures provided by `modflow-devtools`. External model tests can be selected by model or simulation name, or by packages used. See the `modflow-devtools` documentation for usage examples. Note that filtering options only apply to tests using external models, and will not filter tests defining models in code; for that, the `pytest` built-in `-k` option may be used.
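For example, to run only tests whose names contain a given substring (the substring below is just a placeholder):

```shell
# "dis" is a placeholder; substitute any substring of the test names of interest
pytest -v -n auto -k "dis"
```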
Tests should ideally follow a few conventions for easier maintenance:
- Use temporary directory fixtures. Tests which write to disk should use `pytest`'s built-in `tmp_path` fixtures or one of the keepable temporary directory fixtures from `modflow-devtools`. This prevents tests from polluting one another's state.
- Use markers for convenient (de-)selection:
  - `@pytest.mark.slow` if the test doesn't complete in a few seconds (this preserves the ability to quickly `--smoke` test)
  - `@pytest.mark.repo` if the test relies on external model repositories
  - `@pytest.mark.regression` if the test compares results from different versions
Note: if the three external model repositories are not cloned as described above, some tests will be skipped. The full test suite includes >750 cases. All must pass before changes can be merged into this repository.
Run `build_makefiles.py` in the `distribution/` directory after adding, removing, or renaming source files. This script uses Pymake to regenerate makefiles. For instance:

```shell
python build_makefiles.py
```
If the utilities located in the `utils` directory (e.g., `mf5to6` and `zbud6`) are affected by changes to the modflow6 `src/` directory (such as new or refactored source files), then the new module source file should also be added to the utility's `utils/<util>/pymake/extrafiles.txt` file. This file informs Pymake of source files living outside the main source directory, so they can be included in generated makefiles.

Module dependencies for features still under development should be added to `excludefiles.txt`. Source files listed in this file will be excluded from makefiles generated by Pymake. Makefiles should only include the source files needed to build officially released/supported features.
Makefile generation and usage can be tested from the `distribution` directory by running the `build_makefiles.py` script with Pytest:

```shell
pytest -v build_makefiles.py
```
Note: `make` is required to test compiling MODFLOW 6 with makefiles. If `make` is not discovered on the system path, compile tests will be skipped.
Makefiles may also be tested manually by changing to the appropriate `make` subdirectory (of the project root for MODFLOW 6, or inside the corresponding `utils` subdirectory for the zonebudget or converter utilities) and invoking `make` (`make clean` may first be necessary to remove previously created object files).
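For example, to build MODFLOW 6 itself from the project root:

```shell
# change to the make subdirectory, clean old object files, then build
cd make
make clean
make
```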
On Windows, it is recommended to generate and test makefiles from a Unix-like shell rather than PowerShell or Command Prompt. Make can be installed via Conda or Chocolatey. Alternatively, it is included with MinGW, which is also available from Chocolatey.
To use Conda from Git Bash on Windows, first run the `conda.sh` script located in your Conda installation's `/etc/profile.d` subdirectory. For instance, with Anaconda3:

```shell
. /c/Anaconda3/etc/profile.d/conda.sh
```

Or Miniconda3:

```shell
. /c/ProgramData/miniconda3/etc/profile.d/conda.sh
```

After this, `conda` commands should be available.
This command may be added to a `.bashrc` or `.bash_profile` file in your home directory to permanently configure Git Bash for Conda.
This section documents MODFLOW 6 branching strategy and other VCS-related procedures.
This project follows the git flow model: development occurs on the `develop` branch, while `master` is reserved for the state of the latest release. Development PRs are typically squashed to `develop` to avoid merge commits. At release time, release branches are merged to `master`, and then `master` is merged back into `develop`.
When a feature branch takes a long time to develop, it can easily fall out of sync with the develop branch. Depending on the situation, it may be advisable to periodically squash the commits on the feature branch and rebase the change set with develop. The following approach for updating a long-lived feature branch has proven robust.
In the example below, the feature branch is assumed to be called `feat-xyz`.
Begin by creating a backup copy of the feature branch in case anything goes terribly wrong.
```shell
git checkout feat-xyz
git checkout -b feat-xyz-backup
git checkout feat-xyz
```
Next, consider squashing commits on the feature branch. If there are many commits, it is beneficial to squash them before trying to rebase with develop. There is a nice article on squashing commits into one using git, which has been very useful for consolidating commits on a long-lived modflow6 feature branch.
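One common way to squash (not necessarily the exact approach described in the article) is an interactive rebase onto develop, marking all but the first commit as `squash` or `fixup` in the editor that opens:

```shell
# interactively rebase the feature branch onto develop, squashing its commits
git rebase -i develop
```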
A quick and dirty way to squash without an interactive rebase (as an alternative to the interactive approach described above) is a soft reset followed by an amended commit. Making a backup of the feature branch first is strongly recommended before using this approach, as accidentally typing `--hard` instead of `--soft` will wipe out all your work.

```shell
git reset --soft <first new commit on the feature branch>
git commit --amend -m "consolidated commit message"
```
Once the commits on the feature branch have been consolidated, a force push to origin is recommended. This is not strictly required, but it can serve as an intermediate backup/checkpoint so the squashed branch state can be retrieved if rebasing fails. The following command will push `feat-xyz` to origin:

```shell
git push origin feat-xyz --force
```

The `--force` flag's short form is `-f`.
Now that the commits on `feat-xyz` have been consolidated, it is time to rebase with develop. If there are multiple commits in `feat-xyz` that make changes, undo them, rename files, and/or move things around in subsequent commits, then there may be multiple sets of merge conflicts that will need to be resolved as the rebase works its way through the commit change sets. This is why it is beneficial to squash the feature commits before rebasing with develop.
To rebase with develop, make sure the feature branch is checked out and then type:

```shell
git rebase develop
```

If anything goes wrong during a rebase, the `git rebase --abort` command can be used to unwind it.
If there are merge conflicts, they will need to be resolved before going forward. Once any conflicts are resolved, it may be worthwhile to rebuild the MODFLOW 6 program and run the smoke tests to ensure nothing is broken.
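For instance, following the build and test steps described earlier:

```shell
# reinstall up-to-date binaries, then run the smoke tests
meson install -C builddir
cd autotest
pytest -v -n auto -S
```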
At this point, you will want to force push the updated feature branch to origin using the same force push command as before:

```shell
git push origin feat-xyz --force
```
Lastly, if you are satisfied with the results and confident the procedure went well, then you can delete the backup that you created at the start:

```shell
git branch -d feat-xyz-backup
```
This process can be repeated periodically to stay in sync with the develop branch and keep a clean commit history.
To deprecate a MODFLOW 6 input/output option in a DFN file:
- Add a new `deprecated x.y.z` attribute to the appropriate variable in the package DFN file, where `x.y.z` is the version in which the deprecation is introduced. Mention the deprecation prominently in the release notes.
- If support for the deprecated option is removed (typically after at least 2 minor or major releases or 1 year), add a new `removed x.y.z` attribute to the variable in the DFN file, where `x.y.z` is the version in which support for the option was removed. The line containing `deprecated x.y.z` should not be deleted. Mention the removal prominently in the release notes.
- Deprecated/removed attributes are not removed from DFN files but remain in perpetuity. The `doc/mf6io/mf6ivar/deprecations.py` script generates a markdown deprecation table, which is converted to LaTeX by `doc/ReleaseNotes/mk_deprecations.py` for inclusion in the MODFLOW 6 release notes. Deprecations and removals should still be mentioned separately in the release notes, however.
To search for deprecations and removals in DFN files on a system with `git` and standard Unix commands available:

```shell
git grep 'deprecated' -- '*.dfn'
git grep 'removed' -- '*.dfn'
```