
Wavelet and simulated Annealing SliP inversion (WASP)


This code uses a nonlinear simulated annealing inversion method to model slip amplitude, rake, rupture time, and rise time on a discretized fault plane, finding the solution that best fits the observations in the wavelet domain.

WASP currently accommodates observations from (1) teleseismic broadband stations, (2) regional strong-motion accelerometer stations, (3) static and high-rate Global Navigation Satellite Systems stations, and (4) Interferometric Synthetic Aperture Radar.

The code is based on the approach of Ji et al. (2002). Regional Green's functions are calculated using the method of Zhu & Rivera (2002). Details of the implementation can be found in Koch et al. (2019) and Goldberg et al. (2022).

Suggested Citation

Koch, P., Goldberg, D.E., Hunsinger, H., Melgar, D., Riquelme, S., Yeck, W.L., and Haynie, K.L., 2024, Wavelet and simulated Annealing SliP inversion (WASP), version 1.0.0: U.S. Geological Survey software release, https://doi.org/10.5066/P1EKKUNW.

Authors

References

Users of this code should consider citing the following relevant publications:

  • Ji, C., D. J. Wald, and D. V. Helmberger (2002). Source description of the 1999 Hector Mine, California, earthquake, Part I: Wavelet domain inversion theory and resolution analysis, Bulletin of the Seismological Society of America, 92, no. 4, 1192–1207, https://doi.org/10.1785/0120000916.
  • Koch, P., F. Bravo, S. Riquelme, and J. G. F. Crempien (2019). Near-real-time finite-fault inversions for large earthquakes in Chile using strong-motion data, Seismological Research Letters, 90, no. 5, 1971–1986, https://doi.org/10.1785/0220180294.
  • Goldberg, D. E., P. Koch, D. Melgar, S. Riquelme, and W. L. Yeck (2022). Beyond the Teleseism: Introducing Regional Seismic and Geodetic Data into Routine USGS Finite-Fault Modeling, Seismological Research Letters, 93, 3308–3323, https://doi.org/10.1785/0220220047.
  • Zhu, L., and L. A. Rivera (2002). A note on the dynamic and static displacements from a point source in multilayered media, Geophysical Journal International, 148, no. 3, 619–627, https://doi.org/10.1046/j.1365-246X.2002.01610.x.

Disclaimer and License Information

  • Disclaimer
  • License

Installation

Prerequisites

In order to compile and/or install the source code, the following prerequisites are required (a quick check of their availability is sketched after the list):

  1. gfortran: To compile the code in fortran_code
  2. cmake: To compile the code in fortran_code
  3. gcc: To provide support to miniconda for compiling C code
  4. miniconda/anaconda: To install Python dependencies. Conda can be installed using the provided script: conda_install.sh
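
A minimal sketch for confirming that the prerequisites are available before installing; the plain bash invocation of conda_install.sh is an assumption about how the provided script is run:

    # confirm the compilers and build tools are on the PATH
    gfortran --version
    cmake --version
    gcc --version

    # install miniconda using the provided script (invocation assumed; run from the repository root)
    bash conda_install.sh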

Wasp Installation Scripts

Automated installation of the dependencies and Fortran code is provided in the form of the install script install.sh. Currently this install script only supports installation on Linux systems, as the Fortran code cannot be compiled on macOS. To install the code, please ensure that all of the prerequisites are available and that miniconda/anaconda has been initialized, then follow the steps below (a full command sequence is sketched after the list):

  1. source install.sh <path to the local neic-finitefault repository> (with other optional configurations available; run sudo bash user_install.sh -h for the help information)
    1. NOTE: The scripts in ./install.d may be run individually to suit individual needs. For example, to rerun only the compilation of the Fortran code, you can run wasp.sh on its own.

  2. conda activate ff-env
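
Putting the steps together, a minimal command sequence on a Linux system where the prerequisites are already in place and conda has been initialized might look like the following (the repository path is a placeholder):

    # run the automated installer against the local clone
    source install.sh <path to the local neic-finitefault repository>

    # activate the conda environment created by the installer
    conda activate ff-env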

The following documents provide more information about the installation process:

  • Data Dependencies: Provides a list of data required to run the code
  • Code Dependencies: Provides a list of dependencies required to run the code
  • Manual Installation: Provides a list of steps to manually install dependencies and code without reference to a specific operating system.

Local Testing

Tests and linting can both be run locally:

  1. To run all Python unit tests: poe test
    1. The full end-to-end inversion tests take a considerable amount of time to run. As a result, they are skipped by default and can be enabled by setting the following environment variables to "True" (see the example after this list):
      • RUN_ALL
      • RUN_END_TO_END
  2. To run Python linting: poe lint
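
For example, the following enables the long-running end-to-end tests for a single invocation and then runs the linters; setting the variables inline is just one way to pass them:

    # run all unit tests, including the end-to-end inversion tests
    RUN_ALL=True RUN_END_TO_END=True poe test

    # run Python linting
    poe lint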

Using the Docker Image

This repository provides docker images for the dependencies and source code. Below are some useful commands for interacting with these images. See the Dockerfile for the build steps/configuration.

While the commands listed here are Docker commands, they map one to one onto Podman commands: simply replace the word "docker" in each command with "podman".

Pulling docker images

  • docker pull <image name> (an example is given after this list)
    • Names of available images:
      1. Image with python dependencies (e.g. conda environment ff-env): code.usgs.gov:5001/ghsc/neic/algorithms/neic-finitefault/wasp-python
      2. Image with all of the above (1) and the compiled fortran code: code.usgs.gov:5001/ghsc/neic/algorithms/neic-finitefault/wasp-fortran
      3. Image with all of the above (1 and 2) and the data dependencies (e.g. fd_bank): code.usgs.gov:5001/ghsc/neic/algorithms/neic-finitefault/wasp-dependencies
      4. Image with all of the above (1, 2, and 3) and the python source code: code.usgs.gov:5001/ghsc/neic/algorithms/neic-finitefault/wasp
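
For example, pulling the complete image (Python dependencies, compiled Fortran code, data dependencies, and source code):

    docker pull code.usgs.gov:5001/ghsc/neic/algorithms/neic-finitefault/wasp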

Building the docker image locally

  1. Go to the top level of your local repository: cd <path to the local repository>
  2. Build the image: docker build . (a fuller example is sketched after this list)
    • Useful Optional Flags (must come before specifying the location of the Dockerfile):
      • give the image a name: -t <name>
      • build to a specific layer: --target <layer name>
        • Available layers: packages, wasp-python, wasp-fortran, wasp-dependencies, wasp
      • add a build argument: --build-arg <KEY>=<VALUE>
        • Available argument keys: FROM_IMAGE
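
A sketch combining these flags; the tag wasp-local is an arbitrary name chosen for illustration, and the target shown is just one of the available layers:

    # from the top level of the local repository
    cd <path to the local repository>

    # build up to the wasp layer and tag the result
    docker build -t wasp-local --target wasp .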

Run the docker image interactively

  • docker run --rm -i <name> bash (a fuller example is sketched below)
    • Useful Optional Flags (must come before the image name and the bash command):
      • mount a local directory into the container: -v <path to local directory>:<path to location in the docker container>
      • open a port (useful for Jupyter notebooks that need to display in a web browser): -p <host port>:<container port>
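
A sketch of an interactive run, assuming the image was tagged wasp-local as in the build example above; the mount paths are placeholders and port 8888 is only a typical choice for Jupyter notebooks:

    docker run --rm -i \
      -v <path to local directory>:<path to location in the docker container> \
      -p 8888:8888 \
      wasp-local bash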