Merge branch 'main' into docs_internal_design
max-sixty authored Sep 7, 2023
2 parents d00e8a3 + e2b6f34 commit 440f95a
Showing 36 changed files with 628 additions and 262 deletions.
34 changes: 28 additions & 6 deletions .codecov.yml
@@ -1,16 +1,38 @@
codecov:
ci:
# by default, codecov doesn't recognize azure as a CI provider
- dev.azure.com
require_ci_to_pass: yes
require_ci_to_pass: true

coverage:
status:
project:
default:
# Require 1% coverage, i.e., always succeed
target: 1
target: 1%
flags:
- unittests
paths:
- "!xarray/tests/"
unittests:
target: 90%
flags:
- unittests
paths:
- "!xarray/tests/"
mypy:
target: 20%
flags:
- mypy
patch: false
changes: false

comment: off
comment: false

flags:
unittests:
paths:
- "xarray"
- "!xarray/tests"
carryforward: false
mypy:
paths:
- "xarray"
carryforward: false
4 changes: 2 additions & 2 deletions .github/workflows/ci-additional.yaml
@@ -123,7 +123,7 @@ jobs:
- name: Run mypy
run: |
python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report xarray/
- name: Upload mypy coverage to Codecov
uses: codecov/codecov-action@v3.1.4
@@ -177,7 +177,7 @@ jobs:
- name: Run mypy
run: |
python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report xarray/
- name: Upload mypy coverage to Codecov
uses: codecov/codecov-action@v3.1.4
2 changes: 1 addition & 1 deletion .github/workflows/ci.yaml
@@ -42,7 +42,7 @@ jobs:
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
# Bookend python versions
python-version: ["3.9", "3.10", "3.11"]
python-version: ["3.9", "3.11"]
env: [""]
include:
# Minimum python version:
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -18,7 +18,7 @@ repos:
files: ^xarray/
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: 'v0.0.282'
rev: 'v0.0.287'
hooks:
- id: ruff
args: ["--fix"]
@@ -35,7 +35,7 @@ repos:
additional_dependencies: ["black==23.7.0"]
- id: blackdoc-autoupdate-black
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.4.1
rev: v1.5.1
hooks:
- id: mypy
# Copied from setup.cfg
2 changes: 1 addition & 1 deletion README.md
@@ -1,7 +1,7 @@
# xarray: N-D labeled arrays and datasets

[![CI](https://github.com/pydata/xarray/workflows/CI/badge.svg?branch=main)](https://github.com/pydata/xarray/actions?query=workflow%3ACI)
[![Code coverage](https://codecov.io/gh/pydata/xarray/branch/main/graph/badge.svg)](https://codecov.io/gh/pydata/xarray)
[![Code coverage](https://codecov.io/gh/pydata/xarray/branch/main/graph/badge.svg?flag=unittests)](https://codecov.io/gh/pydata/xarray)
[![Docs](https://readthedocs.org/projects/xray/badge/?version=latest)](https://docs.xarray.dev/)
[![Benchmarked with asv](https://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat)](https://pandas.pydata.org/speed/xarray/)
[![Available on pypi](https://img.shields.io/pypi/v/xarray.svg)](https://pypi.python.org/pypi/xarray/)
5 changes: 4 additions & 1 deletion doc/api-hidden.rst
@@ -22,6 +22,7 @@
Coordinates.to_dataset
Coordinates.to_index
Coordinates.update
Coordinates.assign
Coordinates.merge
Coordinates.copy
Coordinates.equals
@@ -39,8 +40,9 @@
core.coordinates.DatasetCoordinates.to_dataset
core.coordinates.DatasetCoordinates.to_index
core.coordinates.DatasetCoordinates.update
core.coordinates.DatasetCoordinates.assign
core.coordinates.DatasetCoordinates.merge
core.coordinates.DataArrayCoordinates.copy
core.coordinates.DatasetCoordinates.copy
core.coordinates.DatasetCoordinates.equals
core.coordinates.DatasetCoordinates.identical

@@ -79,6 +81,7 @@
core.coordinates.DataArrayCoordinates.to_dataset
core.coordinates.DataArrayCoordinates.to_index
core.coordinates.DataArrayCoordinates.update
core.coordinates.DataArrayCoordinates.assign
core.coordinates.DataArrayCoordinates.merge
core.coordinates.DataArrayCoordinates.copy
core.coordinates.DataArrayCoordinates.equals
53 changes: 52 additions & 1 deletion doc/whats-new.rst
@@ -14,6 +14,58 @@ What's New
np.random.seed(123456)
.. _whats-new.2023.08.1:

v2023.08.1 (unreleased)
-----------------------

New Features
~~~~~~~~~~~~

- Added the :py:meth:`Coordinates.assign` method that can be used to combine
different collections of coordinates prior to assigning them to a Dataset or
DataArray at once (:pull:`8102`).
By `Benoît Bovy <https://github.com/benbovy>`_.
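
As a rough illustration of the new method described in this entry (a sketch only, not part of the changed file; the exact call signature may differ from the released API):

```python
import numpy as np
import xarray as xr

# Build one collection of coordinates; by default a pandas index is
# created for the dimension coordinate "x" (see the breaking change below).
coords = xr.Coordinates({"x": np.arange(3)})

# Combine it with an additional coordinate, then assign everything at once.
combined = coords.assign(y=[10, 20])
ds = xr.Dataset(coords=combined)
```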

Breaking changes
~~~~~~~~~~~~~~~~

- The :py:class:`Coordinates` constructor now creates a (pandas) index by
default for each dimension coordinate. To keep the previous behavior (no index
created), pass an empty dictionary to ``indexes``. The constructor now also
extracts and adds the indexes from another :py:class:`Coordinates` object
passed via ``coords`` (:pull:`8107`).
By `Benoît Bovy <https://github.com/benbovy>`_.
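
A minimal sketch of the two behaviours described above (assumed usage, not part of the diff):

```python
import numpy as np
import xarray as xr

# New default: a pandas index is created for the dimension coordinate "x".
indexed = xr.Coordinates({"x": np.arange(3)})

# Previous behaviour (no index created): pass an empty dict to ``indexes``.
unindexed = xr.Coordinates({"x": np.arange(3)}, indexes={})
```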

Deprecations
~~~~~~~~~~~~

- Deprecate passing a :py:class:`pandas.MultiIndex` object directly to the
:py:class:`Dataset` and :py:class:`DataArray` constructors as well as to
:py:meth:`Dataset.assign` and :py:meth:`Dataset.assign_coords`.
A new Xarray :py:class:`Coordinates` object has to be created first using
:py:meth:`Coordinates.from_pandas_multiindex` (:pull:`8094`).
By `Benoît Bovy <https://github.com/benbovy>`_.
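
A sketch of the migration this deprecation implies (illustrative only; the variable names are made up):

```python
import pandas as pd
import xarray as xr

midx = pd.MultiIndex.from_product([["a", "b"], [1, 2]], names=("lvl1", "lvl2"))

# Deprecated: passing the MultiIndex directly, e.g. xr.Dataset(coords={"x": midx}).
# Recommended: wrap it in a Coordinates object first.
midx_coords = xr.Coordinates.from_pandas_multiindex(midx, "x")
ds = xr.Dataset(coords=midx_coords)
```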

Bug fixes
~~~~~~~~~

- Improved handling of multi-coordinate indexes when updating coordinates, including bug fixes
(and improved warnings for deprecated features) for pandas multi-indexes (:pull:`8094`).
By `Benoît Bovy <https://github.com/benbovy>`_.
- Fixed a bug in :py:func:`merge` with ``compat='minimal'`` where the coordinate
names were not updated properly internally (:issue:`7405`, :issue:`7588`,
:pull:`8104`).
By `Benoît Bovy <https://github.com/benbovy>`_.
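
For context, the kind of call the second fix targets (a minimal, made-up example; it does not reproduce the original bug):

```python
import xarray as xr

ds1 = xr.Dataset({"a": ("x", [1, 2])}, coords={"x": [0, 1]})
ds2 = xr.Dataset({"b": ("x", [3, 4])}, coords={"x": [0, 1]})

merged = xr.merge([ds1, ds2], compat="minimal")
```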

Documentation
~~~~~~~~~~~~~


Internal Changes
~~~~~~~~~~~~~~~~


.. _whats-new.2023.08.0:

v2023.08.0 (Aug 18, 2023)
@@ -90,7 +142,6 @@ Breaking changes
numbagg 0.1 0.2.1
===================== ========= ========


Documentation
~~~~~~~~~~~~~

15 changes: 12 additions & 3 deletions xarray/backends/api.py
@@ -930,7 +930,9 @@ def open_mfdataset(
If a callable, it must expect a sequence of ``attrs`` dicts and a context object
as its only parameters.
**kwargs : optional
Additional arguments passed on to :py:func:`xarray.open_dataset`.
Additional arguments passed on to :py:func:`xarray.open_dataset`. For an
overview of some of the possible options, see the documentation of
:py:func:`xarray.open_dataset`.
Returns
-------
@@ -965,6 +967,13 @@ def open_mfdataset(
... "file_*.nc", concat_dim="time", preprocess=partial_func
... ) # doctest: +SKIP
It is also possible to use any argument to ``open_dataset`` together
with ``open_mfdataset``, for example ``drop_variables``:
>>> ds = xr.open_mfdataset(
... "file.nc", drop_variables=["varname_1", "varname_2"] # any list of vars
... ) # doctest: +SKIP
References
----------
@@ -1047,8 +1056,8 @@ def open_mfdataset(
)
else:
raise ValueError(
"{} is an invalid option for the keyword argument"
" ``combine``".format(combine)
f"{combine} is an invalid option for the keyword argument"
" ``combine``"
)
except ValueError:
for ds in datasets:
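Most of the remaining hunks in this commit apply the same mechanical change: error messages built with ``str.format()`` are rewritten as f-strings. A small illustration of the equivalence, using the message from the hunk above:

```python
combine = "by_coords"

old_style = (
    "{} is an invalid option for the keyword argument"
    " ``combine``".format(combine)
)
new_style = (
    f"{combine} is an invalid option for the keyword argument"
    " ``combine``"
)
assert old_style == new_style
```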
8 changes: 4 additions & 4 deletions xarray/coding/cftime_offsets.py
@@ -105,7 +105,7 @@ def __init__(self, n: int = 1):
if not isinstance(n, int):
raise TypeError(
"The provided multiple 'n' must be an integer. "
"Instead a value of type {!r} was provided.".format(type(n))
f"Instead a value of type {type(n)!r} was provided."
)
self.n = n

@@ -353,13 +353,13 @@ def _validate_month(month, default_month):
raise TypeError(
"'self.month' must be an integer value between 1 "
"and 12. Instead, it was set to a value of "
"{!r}".format(result_month)
f"{result_month!r}"
)
elif not (1 <= result_month <= 12):
raise ValueError(
"'self.month' must be an integer value between 1 "
"and 12. Instead, it was set to a value of "
"{!r}".format(result_month)
f"{result_month!r}"
)
return result_month

@@ -771,7 +771,7 @@ def to_cftime_datetime(date_str_or_date, calendar=None):
raise TypeError(
"date_str_or_date must be a string or a "
"subclass of cftime.datetime. Instead got "
"{!r}.".format(date_str_or_date)
f"{date_str_or_date!r}."
)


11 changes: 5 additions & 6 deletions xarray/coding/cftimeindex.py
@@ -228,12 +228,12 @@ def assert_all_valid_date_type(data):
if not isinstance(sample, cftime.datetime):
raise TypeError(
"CFTimeIndex requires cftime.datetime "
"objects. Got object of {}.".format(date_type)
f"objects. Got object of {date_type}."
)
if not all(isinstance(value, date_type) for value in data):
raise TypeError(
"CFTimeIndex requires using datetime "
"objects of all the same type. Got\n{}.".format(data)
f"objects of all the same type. Got\n{data}."
)


@@ -553,8 +553,7 @@ def shift(self, n: int | float, freq: str | timedelta):
return self + n * to_offset(freq)
else:
raise TypeError(
"'freq' must be of type "
"str or datetime.timedelta, got {}.".format(freq)
"'freq' must be of type " f"str or datetime.timedelta, got {freq}."
)

def __add__(self, other):
@@ -636,10 +635,10 @@ def to_datetimeindex(self, unsafe=False):
if calendar not in _STANDARD_CALENDARS and not unsafe:
warnings.warn(
"Converting a CFTimeIndex with dates from a non-standard "
"calendar, {!r}, to a pandas.DatetimeIndex, which uses dates "
f"calendar, {calendar!r}, to a pandas.DatetimeIndex, which uses dates "
"from the standard calendar. This may lead to subtle errors "
"in operations that depend on the length of time between "
"dates.".format(calendar),
"dates.",
RuntimeWarning,
stacklevel=2,
)
6 changes: 3 additions & 3 deletions xarray/coding/strings.py
@@ -59,9 +59,9 @@ def encode(self, variable, name=None):
if contains_unicode and (encode_as_char or not self.allows_unicode):
if "_FillValue" in attrs:
raise NotImplementedError(
"variable {!r} has a _FillValue specified, but "
f"variable {name!r} has a _FillValue specified, but "
"_FillValue is not yet supported on unicode strings: "
"https://github.com/pydata/xarray/issues/1647".format(name)
"https://github.com/pydata/xarray/issues/1647"
)

string_encoding = encoding.pop("_Encoding", "utf-8")
@@ -176,7 +176,7 @@ def char_to_bytes(arr):
if len(arr.chunks[-1]) > 1:
raise ValueError(
"cannot stacked dask character array with "
"multiple chunks in the last dimension: {}".format(arr)
f"multiple chunks in the last dimension: {arr}"
)

dtype = np.dtype("S" + str(arr.shape[-1]))
14 changes: 6 additions & 8 deletions xarray/coding/times.py
@@ -218,8 +218,8 @@ def _decode_datetime_with_pandas(
) -> np.ndarray:
if not _is_standard_calendar(calendar):
raise OutOfBoundsDatetime(
"Cannot decode times from a non-standard calendar, {!r}, using "
"pandas.".format(calendar)
f"Cannot decode times from a non-standard calendar, {calendar!r}, using "
"pandas."
)

delta, ref_date = _unpack_netcdf_time_units(units)
@@ -452,8 +452,8 @@ def cftime_to_nptime(times, raise_on_invalid: bool = True) -> np.ndarray:
except ValueError as e:
if raise_on_invalid:
raise ValueError(
"Cannot convert date {} to a date in the "
"standard calendar. Reason: {}.".format(t, e)
f"Cannot convert date {t} to a date in the "
f"standard calendar. Reason: {e}."
)
else:
dt = "NaT"
@@ -485,10 +485,8 @@ def convert_times(times, date_type, raise_on_invalid: bool = True) -> np.ndarray
except ValueError as e:
if raise_on_invalid:
raise ValueError(
"Cannot convert date {} to a date in the "
"{} calendar. Reason: {}.".format(
t, date_type(2000, 1, 1).calendar, e
)
f"Cannot convert date {t} to a date in the "
f"{date_type(2000, 1, 1).calendar} calendar. Reason: {e}."
)
else:
dt = np.NaN
4 changes: 2 additions & 2 deletions xarray/coding/variables.py
@@ -179,10 +179,10 @@ def safe_setitem(dest, key: Hashable, value, name: T_Name = None):
if key in dest:
var_str = f" on variable {name!r}" if name else ""
raise ValueError(
"failed to prevent overwriting existing key {} in attrs{}. "
f"failed to prevent overwriting existing key {key} in attrs{var_str}. "
"This is probably an encoding field used by xarray to describe "
"how a variable is serialized. To proceed, remove this key from "
"the variable's attributes manually.".format(key, var_str)
"the variable's attributes manually."
)
dest[key] = value
