Merge branch 'main' of github.com:yt-project/yt into mask_phase_plots
matthewturk committed May 18, 2021
2 parents 1386e7b + ddcb73d commit 5810730
Showing 339 changed files with 6,577 additions and 5,426 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -40,7 +40,7 @@ yt/utilities/lib/bounding_volume_hierarchy.c
yt/utilities/lib/contour_finding.c
yt/utilities/lib/cykdtree/kdtree.cpp
yt/utilities/lib/cykdtree/utils.cpp
yt/utilities/lib/cyoctree.cpp
yt/utilities/lib/cyoctree.c
yt/utilities/lib/depth_first_octree.c
yt/utilities/lib/distance_queue.c
yt/utilities/lib/element_mappings.c
20 changes: 15 additions & 5 deletions .pre-commit-config.yaml
@@ -12,22 +12,28 @@ yt/extern\
|yt/visualization/_colormap_data.py\
)"

repos:
ci:
autofix_prs: false

repos:
- repo: https://github.com/asottile/setup-cfg-fmt
rev: v1.17.0
hooks:
- id: setup-cfg-fmt
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.4.0
rev: v4.0.1
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: no-commit-to-branch
args: [--branch, main]
- repo: https://github.com/asottile/pyupgrade
rev: v2.11.0
rev: v2.16.0
hooks:
- id: pyupgrade
args: [--py36-plus]
- repo: https://github.com/psf/black
rev: 20.8b1
rev: 21.5b1
hooks:
- id: black
language_version: python3
@@ -37,7 +43,7 @@ repos:
- id: isort
additional_dependencies: [toml]
- repo: https://github.com/PyCQA/flake8
rev: '3.9.0'
rev: '3.9.2'
hooks:
- id: flake8
additional_dependencies: [
@@ -49,3 +55,7 @@ repos:
hooks:
- id: blacken-docs
additional_dependencies: [black==20.8b1]
- repo: https://github.com/pre-commit/pygrep-hooks
rev: v1.8.0
hooks:
- id: rst-backticks
27 changes: 20 additions & 7 deletions CONTRIBUTING.rst
@@ -244,9 +244,9 @@ If you're interested in participating in yt development, take a look at the
<https://github.com/yt-project/yt/issues>`_.
You can search by labels, indicating estimated level of difficulty or category,
to find issues that you would like to contribute to. Good first issues are
marked with a label of `new contributor friendly`. While we try to triage the
marked with a label of *new contributor friendly*. While we try to triage the
issue tracker regularly to assign appropriate labels to every issue, it may be
the case that issues not marked as `new contributor friendly` are actually
the case that issues not marked as *new contributor friendly* are actually
suitable for new contributors.

Here are some predefined issue searches that might be useful:
@@ -726,8 +726,9 @@ It is recommended (though not required) that you install ``pre-commit`` on your
So that our hooks will run and update your changes on every commit.
If you do not want to/are unable to configure ``pre-commit`` on your machine, note that
after opening a pull request, a bot will run the hooks and validate your contribution by
appending commits to your branch.
after opening a pull request, it will still be run as a static checker as part of our CI.
Some hooks also come with auto-fixing capabilities, which you can trigger manually in a
PR by commenting ``pre-commit.ci run`` (see `pre-commit.ci features <https://pre-commit.ci/#features>`_).

Here's a list of the main automated formatters we use along with a short description

@@ -738,9 +739,21 @@ Here's a list of the main automated formatters we use along with a short description

The complete configuration is located in ``.pre-commit-config.yaml``.

.. note:: It is not recommended to run formatters directly on the command line because
versions available in your system may conflict with the ones we run through
``pre-commit`` hooks (which are updated periodically).
Note that formatters should not be run directly on the command line, as in

.. code-block:: bash

    $ black yt

but they can still be run through ``pre-commit`` itself:

.. code-block:: bash

    $ pre-commit run black --all-files

The reason is that you may have a specific version of ``black`` installed that
produces different results, while the one installed with ``pre-commit`` is
guaranteed to be in sync with the rest of the contributors.

Below are a list of additional guidelines for coding in yt, that are not automatically
enforced.
2 changes: 1 addition & 1 deletion conftest.py
@@ -69,7 +69,7 @@ def pytest_configure(config):
# Register custom marks for answer tests and big data
config.addinivalue_line("markers", "answer_test: Run the answer tests.")
config.addinivalue_line(
"markers", "big_data: Run answer tests that require" " large data files."
"markers", "big_data: Run answer tests that require large data files."
)


2 changes: 1 addition & 1 deletion doc/helper_scripts/show_fields.py
@@ -187,7 +187,7 @@ def print_all_fields(fl):


class FieldInfo:
""" a simple container to hold the information about fields """
"""a simple container to hold the information about fields"""

def __init__(self, ftype, field, ptype):
name = field[0]
12 changes: 6 additions & 6 deletions doc/source/analyzing/domain_analysis/clump_finding.rst
@@ -58,7 +58,7 @@ and either return True or False.
.. code:: python
def _minimum_gas_mass(clump, min_mass):
return clump["gas", "cell_mass"].sum() >= min_mass
return clump["gas", "mass"].sum() >= min_mass
add_validator("minimum_gas_mass", _minimum_gas_mass)
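The validator mechanism above amounts to registering a named predicate over a clump. A minimal self-contained sketch of that pattern follows; the dict-backed registry and the toy clump are illustrative stand-ins, not yt's actual implementation:

```python
import numpy as np

# Illustrative registry standing in for yt's validator machinery
validators = {}


def add_validator(name, function):
    """Register a predicate under a name so clump finding can look it up."""
    validators[name] = function


def _minimum_gas_mass(clump, min_mass):
    # True if the clump's total gas mass meets the threshold
    return bool(clump["gas", "mass"].sum() >= min_mass)


add_validator("minimum_gas_mass", _minimum_gas_mass)

# A toy "clump": a dict of NumPy arrays keyed by (field type, field name)
toy_clump = {("gas", "mass"): np.array([1.0, 2.0])}
print(validators["minimum_gas_mass"](toy_clump, 2.5))  # True (total mass 3.0)
```

During clump finding, yt calls each registered validator on every candidate clump and discards clumps for which any validator returns ``False``.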
@@ -94,7 +94,7 @@ Calculating Clump Quantities

By default, a number of quantities will be calculated for each clump when the
clump finding process has finished. The default quantities are: ``total_cells``,
``cell_mass``, ``mass_weighted_jeans_mass``, ``volume_weighted_jeans_mass``,
``mass``, ``mass_weighted_jeans_mass``, ``volume_weighted_jeans_mass``,
``max_grid_level``, ``min_number_density``, and ``max_number_density``.
Additional items can be added with the
:func:`~yt.data_objects.level_sets.clump_handling.Clump.add_info_item`
@@ -115,7 +115,7 @@ of available info items by calling
def _mass_weighted_jeans_mass(clump):
jeans_mass = clump.data.quantities.weighted_average_quantity(
"jeans_mass", ("gas", "cell_mass")
"jeans_mass", ("gas", "mass")
).in_units("Msun")
return "Jeans Mass (mass-weighted): %.6e Msolar." % jeans_mass
@@ -215,13 +215,13 @@ The ``leaves`` attribute returns a list of all leaf clumps.
print(ds_clumps.leaves)
Info items for each clump can be accessed with the `clump` field type. Gas
or grid fields should be accessed using the `grid` field type and particle
Info items for each clump can be accessed with the ``"clump"`` field type. Gas
or grid fields should be accessed using the ``"grid"`` field type and particle
fields should be accessed using the specific particle type.

.. code:: python
my_clump = ds_clumps.leaves[0]
print(my_clumps["clump", "cell_mass"])
print(my_clump["clump", "mass"])
print(my_clump["grid", "density"])
print(my_clump["all", "particle_mass"])
8 changes: 4 additions & 4 deletions doc/source/analyzing/domain_analysis/cosmology_calculator.rst
@@ -4,9 +4,9 @@ Cosmology Calculator
====================

The cosmology calculator can be used to calculate cosmological distances and
times given a set of cosmological parameters. A cosmological dataset, `ds`,
times given a set of cosmological parameters. A cosmological dataset, ``ds``,
will automatically have a cosmology calculator configured with the correct
parameters associated with it as `ds.cosmology`. A standalone
parameters associated with it as ``ds.cosmology``. A standalone
:class:`~yt.utilities.cosmology.Cosmology` calculator object can be created
in the following way:

@@ -77,8 +77,8 @@ redshift and time are available:
x, x.to("Mpc") and x.to("Mpccm") will be the same. The user should take
care to understand which reference frame is correct for the given calculation.

The helper functions, `co.quan`
and `co.arr` exist to create unitful `YTQuantities` and `YTArray` with the
The helper functions, ``co.quan``
and ``co.arr`` exist to create unitful ``YTQuantities`` and ``YTArray`` with the
unit registry of the cosmology calculator. For more information on the usage
and meaning of each calculation, consult the reference documentation at
:ref:`cosmology-calculator-ref`.
2 changes: 1 addition & 1 deletion doc/source/analyzing/domain_analysis/index.rst
@@ -72,5 +72,5 @@ Extensions
There are a number of independent, yt-related packages for things
like visual effects, interactive widgets, synthetic absorption
spectra, X-ray observations, and merger-trees. See the
`yt Extensions <http://yt-project.org/extensions.html>` page for
`yt Extensions <http://yt-project.org/extensions.html>`_ page for
a list of available extension packages.
14 changes: 6 additions & 8 deletions doc/source/analyzing/fields.rst
@@ -51,8 +51,8 @@ for tab-completing as well as easier access to information.
Additionally, if you have `ipywidgets
<https://ipywidgets.readthedocs.io/en/stable/>`_ installed and are in a `Jupyter
environment <https://jupyter.org/>`_, you can view the rich representation of
the fields (including source code) by either typing `ds.fields` as the last
item in a cell or by calling `display(ds.fields)`. The resulting output will
the fields (including source code) by either typing ``ds.fields`` as the last
item in a cell or by calling ``display(ds.fields)``. The resulting output will
have tabs and source:

.. image:: _images/fields_ipywidget.png
@@ -412,6 +412,8 @@ different magnetic field units in the different :ref:`unit systems <unit_systems
determine how to set up special magnetic field handling when designing a new frontend, check out
:ref:`bfields-frontend`.

.. _species-fields:

Species Fields
--------------

@@ -454,7 +456,7 @@ defined:
To refer to the number density of the entirety of a single atom or molecule (regardless
of its ionization state), please use the ``MM_nuclei_density`` fields.

Finally, if the abundances of hydrogen and helium are not defined, it is assumed that
Finally, if the abundances of hydrogen and helium are not defined, it is
assumed that these elements are fully ionized with primordial abundances. In this case,
the following fields are defined:

@@ -607,7 +609,7 @@ this will be handled automatically:
)
Which, because the axis is ``"z"``, will give you the same result if you had
projected the `"velocity_z"`` field. This also works for off-axis projections:
projected the ``"velocity_z"`` field. This also works for off-axis projections:

.. code-block:: python
@@ -806,10 +808,6 @@ default this is 64, but it can be supplied as the final argument to
``add_nearest_neighbor_field``. For the example above, it would be
``nearest_neighbor_64``.

This can then be used as input to the function
``add_volume_weighted_smoothed_field``, which can enable smoothing particle
types that would normally not be smoothed.

Commonly, not just the identity of the nearest particle is interesting, but the
value of a given field associated with that particle. yt provides a function
that can do this, as well. This deposits into the indexing octree the value
26 changes: 13 additions & 13 deletions doc/source/analyzing/filtering.rst
@@ -12,7 +12,7 @@ Filters can be generated based on spatial position, say in a sphere
in the center of your dataset space, or more generally they can be
defined by the properties of any field in the simulation.

Because `mesh fields` are internally different from `particle fields`,
Because *mesh fields* are internally different from *particle fields*,
there are different ways of filtering each type as indicated below;
however, filtering fields by spatial location (i.e. geometric
objects) will apply to both types equally.
@@ -54,11 +54,11 @@ set a simple mask based on the contents of one of our fields.
import yt
ds = yt.load('Enzo_64/DD0042/data0042')
ad = ds.all_data()
hot = ad["temperature"].in_units('K') > 1e6
print('Temperature of all data: ad["temperature"] = \n%s' % ad["temperature"])
hot = ad[("gas", "temperature")].in_units('K') > 1e6
print('Temperature of all data: ad[("gas", "temperature")] = \n%s' % ad[("gas", "temperature")])
print("Boolean Mask: hot = \n%s" % hot)
print('Temperature of "hot" data: ad["temperature"][hot] = \n%s' %
ad['temperature'][hot])
print('Temperature of "hot" data: ad[("gas", "temperature")][hot] = \n%s' %
ad[("gas", "temperature")][hot])

This was a simple example, but one can make the conditionals that define
a boolean mask have multiple parts, and one can stack masks together to
@@ -71,9 +71,9 @@ used if you simply need to access the NumPy arrays:
ds = yt.load('Enzo_64/DD0042/data0042')
ad = ds.all_data()
overpressure_and_fast = (ad["pressure"] > 1e-14) & (ad["velocity_magnitude"].in_units('km/s') > 1e2)
print('Density of all data: ad["density"] = \n%s' % ad['density'])
print('Density of "overpressure and fast" data: overpressure_and_fast['density'] = \n%s' %
overpressure_and_fast['density'])
print('Density of all data: ad[("gas", "density")] = \n%s' % ad[("gas", "density")])
print('Density of "overpressure and fast" data: overpressure_and_fast[("gas", "density")] = \n%s' %
overpressure_and_fast[("gas", "density")])
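These cuts are ordinary NumPy boolean masks under the hood, so the pattern can be shown with plain arrays; the values below are made up purely for illustration and replace the yt field access:

```python
import numpy as np

# Illustrative stand-in arrays for the yt fields used above
pressure = np.array([1e-15, 2e-14, 5e-14, 1e-16])
speed_kms = np.array([50.0, 150.0, 300.0, 10.0])

# Conditions combine element-wise with & (and), | (or), ~ (not)
overpressure_and_fast = (pressure > 1e-14) & (speed_kms > 1e2)

print(overpressure_and_fast)            # [False  True  True False]
print(pressure[overpressure_and_fast])  # only the cells passing both cuts
```

Indexing an array with the mask, as in the last line, is exactly what selecting ``ad[("gas", "density")][hot]`` does with yt data.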

.. _cut-regions:

@@ -96,13 +96,13 @@ filtering out unwanted regions. Such wrapper functions are methods of
import yt
ds = yt.load('Enzo_64/DD0042/data0042')
ad = ds.all_data()
overpressure_and_fast = ad.include_above('pressure', 1e-14)
overpressure_and_fast = ad.include_above(("gas", "pressure"), 1e-14)
# You can chain include_xx and exclude_xx to produce the intersection of cut regions
overpressure_and_fast = overpressure_and_fast.include_above('velocity_magnitude', 1e2, 'km/s')
overpressure_and_fast = overpressure_and_fast.include_above(("gas", "velocity_magnitude"), 1e2, 'km/s')

print('Density of all data: ad["density"] = \n%s' % ad['density'])
print('Density of "overpressure and fast" data: overpressure_and_fast["density"] = \n%s' %
overpressure_and_fast['density'])
print('Density of all data: ad[("gas", "density")] = \n%s' % ad[("gas", density")])
print('Density of "overpressure and fast" data: overpressure_and_fast[("gas", "density")] = \n%s' %
overpressure_and_fast[("gas", "density")])
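Chaining ``include_*`` and ``exclude_*`` calls produces the intersection of the individual cuts; a plain-NumPy sketch of that equivalence (the arrays and thresholds are illustrative, not yt's implementation):

```python
import numpy as np

# Illustrative stand-ins for the fields cut on above
pressure = np.array([1e-15, 2e-14, 5e-14, 1e-16])
speed_kms = np.array([50.0, 150.0, 300.0, 10.0])

selection = np.ones(pressure.size, dtype=bool)  # start from "all data"
selection &= pressure > 1e-14                   # like include_above(("gas", "pressure"), 1e-14)
selection &= speed_kms > 1e2                    # like include_above(("gas", "velocity_magnitude"), 1e2, "km/s")

print(np.flatnonzero(selection))  # indices surviving both cuts
```

Each chained call can only narrow the selection, which is why the order of ``include_*``/``exclude_*`` calls does not change the final region.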

The following exclude and include functions are supported:
- :func:`~yt.data_objects.data_containers.YTSelectionContainer3D.include_equal` - Only include values equal to given value
23 changes: 13 additions & 10 deletions doc/source/analyzing/generating_processed_data.rst
@@ -146,7 +146,7 @@ density within a sphere can be created in the following way:
("gas", "temperature"), # profile field
("gas", "radial_velocity"),
], # profile field
weight_field=("gas", "cell_mass"),
weight_field=("gas", "mass"),
)
The binning, weight, and profile data can now be accessed as:
@@ -180,8 +180,11 @@ temperature can be created as follows:
.. code-block:: python
profile2d = source.profile(
[("gas", "density"), ("gas", "temperature")], # the x bin field # the y bin field
[("gas", "cell_mass")], # the profile field
[
("gas", "density"),
("gas", "temperature"),
], # the x bin field # the y bin field
[("gas", "mass")], # the profile field
weight_field=None,
)
@@ -191,7 +194,7 @@ Accessing the x, y, and profile fields work just as with one-dimensional profile
print(profile2d.x)
print(profile2d.y)
print(profile2d["gas", "cell_mass"])
print(profile2d["gas", "mass"])
One of the more interesting things that is enabled with this approach is
the generation of 1D profiles that correspond to 2D profiles. For instance, a
@@ -219,7 +222,7 @@ for each bin field or ``None`` to use the default settings.
custom_bins = np.array([1e-27, 1e-25, 2e-25, 5e-25, 1e-23])
profile2d = source.profile(
[("gas", "density"), ("gas", "temperature")],
[("gas", "cell_mass")],
[("gas", "mass")],
override_bins={("gas", "density"): custom_bins, ("gas", "temperature"): None},
)
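What a weighted profile computes can be sketched with plain NumPy: digitize the bin field, then take a weighted average of the profile field in each bin. The values below are toy data and the loop is a conceptual stand-in, not yt's implementation:

```python
import numpy as np

# Toy data: bin "temperature" by "radius", weighting by "mass"
radius = np.array([0.1, 0.2, 0.3, 0.7, 0.8])
temperature = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
mass = np.array([1.0, 1.0, 2.0, 1.0, 3.0])

bin_edges = np.array([0.0, 0.5, 1.0])
which_bin = np.digitize(radius, bin_edges) - 1  # bin index per cell

# Mass-weighted mean temperature per radial bin (assumes no bin is empty)
profile = np.array(
    [
        np.average(temperature[which_bin == i], weights=mass[which_bin == i])
        for i in range(bin_edges.size - 1)
    ]
)
print(profile)  # [22.5 47.5]
```

With ``weight_field=None``, the profile instead accumulates a plain sum of the profile field in each bin.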
@@ -230,8 +233,8 @@ Exporting Profiles to DataFrame

One-dimensional profile data can be exported to a :class:`~pandas.DataFrame` object
using the :meth:`yt.data_objects.profiles.Profile1D.to_dataframe` method. Bins which
do not have data will have their fields filled with `NaN`s, except for the bin field
itself. If you only want to export the bins which are used, set `only_used=True`.
do not have data will have their fields filled with ``NaN``, except for the bin field
itself. If you only want to export the bins which are used, set ``only_used=True``.

.. code-block:: python
@@ -253,8 +256,8 @@ Exporting Profiles to QTable
One-dimensional profile data also can be exported to an AstroPy :class:`~astropy.table.QTable`
object. This table can then be written to disk in a number of formats, such as ASCII text
or FITS files, and manipulated in a number of ways. Bins which do not have data
will have their mask values set to `False`. If you only want to export the bins
which are used, set `only_used=True`. Units are preserved in the table by converting
will have their mask values set to ``False``. If you only want to export the bins
which are used, set ``only_used=True``. Units are preserved in the table by converting
each :class:`~yt.units.yt_array.YTArray` to an :class:`~astropy.units.Quantity`.

To export the 1D profile to a Table object, simply call
@@ -330,4 +333,4 @@ interoperability with anything that can take xarray data. The classes that can
grid = ds.r[::256j, ::256j, ::256j]
obj = grid.to_xarray(fields=["density", "temperature"])
The returned object, `obj`, will now have the correct labeled axes and so forth.
The returned object, ``obj``, will now have the correct labeled axes and so forth.
4 changes: 3 additions & 1 deletion doc/source/analyzing/ionization_cube.py
@@ -11,7 +11,9 @@
name="IonizedHydrogen", units="", display_name=r"\frac{\rho_{HII}}{\rho_H}"
)
def IonizedHydrogen(field, data):
return data["HII_Density"] / (data["HI_Density"] + data["HII_Density"])
return data[("gas", "HII_Density")] / (
data[("gas", "HI_Density")] + data[("gas", "HII_Density")]
)
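The derived field above is just an element-wise ratio of arrays; a plain-NumPy sketch with made-up densities (not the yt field system itself):

```python
import numpy as np

# Illustrative densities of neutral (HI) and ionized (HII) hydrogen
hi_density = np.array([0.9, 0.5, 0.1])
hii_density = np.array([0.1, 0.5, 0.9])

# Ionized fraction: HII over total hydrogen, element-wise per cell
ionized_fraction = hii_density / (hi_density + hii_density)
print(ionized_fraction)  # [0.1 0.5 0.9]
```

yt evaluates the registered field function lazily on each data chunk, so the function body only ever needs to express this per-element arithmetic.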


ts = yt.DatasetSeries("SED800/DD*/*.index", parallel=8)