From 5f1cef554d3bb0c46f3bc700618327b7a86d740a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Cl=C3=A9ment=20Robert?=
Date: Wed, 13 Oct 2021 18:49:46 +0200
Subject: [PATCH] DOC: reactivate rst-backticks pre-commit hook and fix nits

---
 .pre-commit-config.yaml     |  8 ++++----
 doc/source/halo_catalog.rst | 18 +++++++++---------
 doc/source/halo_finding.rst | 14 +++++++-------
 3 files changed, 20 insertions(+), 20 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index fec86630..705ad942 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -47,7 +47,7 @@ repos:
   hooks:
   - id: blacken-docs
     additional_dependencies: [black==21.8b0]
-#- repo: https://github.com/pre-commit/pygrep-hooks
-#  rev: v1.9.0
-#  hooks:
-#  - id: rst-backticks
+- repo: https://github.com/pre-commit/pygrep-hooks
+  rev: v1.9.0
+  hooks:
+  - id: rst-backticks
diff --git a/doc/source/halo_catalog.rst b/doc/source/halo_catalog.rst
index af54a58a..4cfd5c37 100644
--- a/doc/source/halo_catalog.rst
+++ b/doc/source/halo_catalog.rst
@@ -251,7 +251,7 @@ analysis begins with a call to
 
    hc.create()
 
-The `save_halos` keyword determines whether the actual Halo objects
+The ``save_halos`` keyword determines whether the actual Halo objects
 are saved after analysis on them has completed or whether just the
 contents of their quantities dicts will be retained for creating the
 final catalog. The looping over halos uses a call to parallel_objects
@@ -274,7 +274,7 @@ Parallelism
 Halo analysis using the
 :class:`~yt_astro_analysis.halo_analysis.halo_catalog.halo_catalog.HaloCatalog`
 can be parallelized by adding ``yt.enable_parallelism()`` to the top of the
-script and running with `mpirun`.
+script and running with ``mpirun``.
 
 .. code-block:: python
 
@@ -290,17 +290,17 @@ script and running with `mpirun`.
 
 The nature of the parallelism can be configured with two keywords provided to the
 :meth:`~yt_astro_analysis.halo_analysis.halo_catalog.halo_catalog.HaloCatalog.create`
-function: `njobs` and `dynamic`. If `dynamic` is set to False, halos will be
-distributed evenly over all processors. If `dynamic` is set to True, halos
-will be allocated to processors via a task queue. The `njobs` keyword determines
+function: ``njobs`` and ``dynamic``. If ``dynamic`` is set to False, halos will be
+distributed evenly over all processors. If ``dynamic`` is set to True, halos
+will be allocated to processors via a task queue. The ``njobs`` keyword determines
 the number of processor groups over which the analysis will be divided. The
-default value for `njobs` is "auto". In this mode, a single processor will be
-allocated to analyze a halo. The `dynamic` keyword is overridden to False if
+default value for ``njobs`` is "auto". In this mode, a single processor will be
+allocated to analyze a halo. The ``dynamic`` keyword is overridden to False if
 the number of processors being used is even and True (use a task queue) if odd.
-Set `njobs` to -1 to mandate a single processor to analyze a halo and to a positive
+Set ``njobs`` to -1 to mandate a single processor to analyze a halo and to a positive
 number to create that many processor groups for performing analysis. The number of
 processors used per halo will then be the total number of processors divided by
-`njobs`. For more information on running ``yt`` in parallel, see
+``njobs``. For more information on running ``yt`` in parallel, see
 :ref:`parallel-computation`.
 
 Loading Created Halo Catalogs
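
An illustrative sketch (not part of the patch) of the parallel workflow that the
halo_catalog.rst text above describes: ``yt.enable_parallelism()`` at the top of the
script, a run under ``mpirun``, and the ``njobs``, ``dynamic``, and ``save_halos``
keywords passed to ``create()``. The dataset name and ``finder_method`` are
placeholders, and the extension-style import is an assumption that may differ
between versions.

.. code-block:: python

   import yt

   # Import path follows the yt extension convention; adjust to your install.
   from yt.extensions.astro_analysis.halo_analysis import HaloCatalog

   yt.enable_parallelism()  # launch with, e.g., mpirun -np 4 python script.py

   ds = yt.load("snapshot_0100")  # placeholder dataset
   hc = HaloCatalog(data_ds=ds, finder_method="hop")

   # njobs=-1: a single processor analyzes each halo.
   # dynamic=True: halos are handed out through a task queue.
   # save_halos=False: keep only the quantities dicts, not the Halo objects.
   hc.create(njobs=-1, dynamic=True, save_halos=False)
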
diff --git a/doc/source/halo_finding.rst b/doc/source/halo_finding.rst
index 59136c07..7e21e214 100644
--- a/doc/source/halo_finding.rst
+++ b/doc/source/halo_finding.rst
@@ -133,7 +133,7 @@ to a directory associated with the ``output_dir`` keyword provided to the
 :class:`~yt_astro_analysis.halo_analysis.halo_catalog.halo_catalog.HaloCatalog`.
 The number of files for each catalog is equal to the number of processors used.
 The catalog files have the naming convention
-`/..h5`, where `dataset_name` refers
+``/..h5``, where ``dataset_name`` refers
 to the name of the snapshot. For more information on loading these with yt,
 see :ref:`halocatalog`.
 
@@ -153,7 +153,7 @@ The ``yt_astro_analysis`` package works with the latest version of
 obtaining and installing ``rockstar-galaxies`` for use with
 ``yt_astro_analysis``.
 
-To run Rockstar, your script must be run with `mpirun` using a minimum of three
+To run Rockstar, your script must be run with ``mpirun`` using a minimum of three
 processors. Rockstar processes are divided into three groups:
 
 * readers: these read particle data from the snapshots. Set the number of readers
@@ -162,7 +162,7 @@ processors. Rockstar processes are divided into three groups:
   Set the number of writers with the ``num_writers`` keyword argument.
 * server: this process coordinates the activity of the readers and writers.
   There is only one server process. The total number of processes given with
-  `mpirun` must be equal to the number of readers plus writers plus one
+  ``mpirun`` must be equal to the number of readers plus writers plus one
   (for the server).
 
 .. code-block:: python
@@ -196,19 +196,19 @@ keyword provided to the
 :class:`~yt_astro_analysis.halo_analysis.halo_catalog.halo_catalog.HaloCatalog`.
 The number of files for each catalog is equal to the number of writers. The
 catalog files have the naming convention
-`halos_..bin`, where catalog number 0 is the
+``halos_..bin``, where catalog number 0 is the
 first halo catalog calculated. For more information on loading these with yt,
 see :ref:`rockstar`.
 
 Parallelism
 -----------
 
-All three halo finders can be run in parallel using `mpirun` and by adding
+All three halo finders can be run in parallel using ``mpirun`` and by adding
 ``yt.enable_parallelism()`` to the top of the script. The computational
 domain will be divided evenly among all processes (among the writers in the
 case of Rockstar) with a small amount of padding to ensure halos on
 sub-volume boundaries are not split. For FoF and HOP, the number of processors used only
-needs to provided to `mpirun` (e.g., `mpirun -np 8` to run on 8 processors).
+needs to be provided to ``mpirun`` (e.g., ``mpirun -np 8`` to run on 8 processors).
 
 .. code-block:: python
 
@@ -238,6 +238,6 @@ belonging to each halo can be saved to the catalog when using either the
 :ref:`fof_finding` or :ref:`hop_finding` methods. The is enabled by default
 and can be disabled by setting ``save_particles`` to ``False`` in the
 ``finder_kwargs`` dictionary, as described above. Rockstar will also save
-halo particles to the `.bin` files. However, reading these is not currently
+halo particles to the ``.bin`` files. However, reading these is not currently
 supported in yt. See :ref:`halocatalog` for information on accessing halo
 particles for FoF and HOP catalogs.
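
A sketch of the Rockstar setup described in the halo_finding.rst hunks above
(again illustrative, not part of the patch): the total process count passed to
``mpirun`` must equal readers plus writers plus one server. Routing
``num_readers`` and ``num_writers`` through ``finder_kwargs`` is an assumption
here, and the dataset name is a placeholder.

.. code-block:: python

   import yt
   from yt.extensions.astro_analysis.halo_analysis import HaloCatalog

   yt.enable_parallelism()  # launch with mpirun -np 4: 1 reader + 2 writers + 1 server

   ds = yt.load("snapshot_0100")  # placeholder dataset
   hc = HaloCatalog(
       data_ds=ds,
       finder_method="rockstar",
       # assumed placement of the reader/writer counts
       finder_kwargs={"num_readers": 1, "num_writers": 2},
   )
   hc.create()
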
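For FoF and HOP, the text above notes that the process count comes from ``mpirun``
alone and that member particles are written to the catalog unless
``save_particles`` is set to ``False`` in ``finder_kwargs``. A minimal sketch of
that configuration, with the same placeholder dataset:

.. code-block:: python

   import yt
   from yt.extensions.astro_analysis.halo_analysis import HaloCatalog

   yt.enable_parallelism()  # e.g., mpirun -np 8 divides the domain into 8 sub-volumes

   ds = yt.load("snapshot_0100")  # placeholder dataset
   hc = HaloCatalog(
       data_ds=ds,
       finder_method="fof",
       finder_kwargs={"save_particles": False},  # skip writing member particles
   )
   hc.create()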