Forked from pydata/xarray
Merge remote-tracking branch 'upstream/main' into groupby-reduce
* upstream/main:
  whats-new dev
  whats-new for 0.20.1 (pydata#5943)
  Docs: fix URL for PTSA (pydata#5935)
  Fix a missing @requires_zarr in tests (pydata#5936)
  fix the detection of backend entrypoints (pydata#5931)
  Explicitly list all reductions in api.rst (pydata#5903)
  DOC: add names of missing contributors to 0.20.0 (pydata#5932)
  new whats-new.rst section
  Update open_rasterio deprecation version number (pydata#5916)
  v0.20 Release notes (pydata#5924)
  [skip-ci] v0.20.0: whats-new for release (pydata#5905)
  Update minimum dependencies for 0.20 (pydata#5917)
  Bump actions/github-script from 4.1 to 5 (pydata#5826)
  remove requirement for setuptools.pkg_resources (pydata#5845)
  Update docstring for apply_ufunc, set_options (pydata#5904)
  Display coords' units for slice plots (pydata#5847)
  Combine by coords dataarray bugfix (pydata#5834)
  Add .chunksizes property (pydata#5900)
dcherian committed Nov 5, 2021
2 parents 8106a4a + f469e22 commit 5cabd19
Showing 44 changed files with 964 additions and 652 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/upstream-dev-ci.yaml
@@ -122,7 +122,7 @@ jobs:
shopt -s globstar
python .github/workflows/parse_logs.py logs/**/*-log
- name: Report failures
- uses: actions/github-script@v4.1
+ uses: actions/github-script@v5
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -158,15 +158,15 @@ jobs:
// If no issue is open, create a new issue,
// else update the body of the existing issue.
if (result.repository.issues.edges.length === 0) {
- github.issues.create({
+ github.rest.issues.create({
owner: variables.owner,
repo: variables.name,
body: issue_body,
title: title,
labels: [variables.label]
})
} else {
- github.issues.update({
+ github.rest.issues.update({
owner: variables.owner,
repo: variables.name,
issue_number: result.repository.issues.edges[0].node.number,
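For context on the change above: actions/github-script v5 moved the bundled Octokit REST methods under a `rest` namespace, which is why `github.issues.create` becomes `github.rest.issues.create`. Both ultimately call GitHub's `POST /repos/{owner}/{repo}/issues` endpoint; a minimal Python sketch of assembling (not sending) the equivalent request, with the helper name made up for illustration:

```python
import json

# Hypothetical helper, for illustration only: build the REST request that
# github.rest.issues.create ultimately issues. Nothing is sent over the network.
def build_create_issue_request(owner, repo, title, body, labels):
    url = f"https://api.github.com/repos/{owner}/{repo}/issues"
    payload = json.dumps({"title": title, "body": body, "labels": list(labels)})
    return url, payload

url, payload = build_create_issue_request(
    "pydata", "xarray", "CI failure report", "nightly run failed", ["CI"]
)
```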
14 changes: 3 additions & 11 deletions ci/min_deps_check.py
@@ -20,24 +20,16 @@
"isort",
"mypy",
"pip",
- "setuptools",
"pytest",
"pytest-cov",
"pytest-env",
"pytest-xdist",
}

- POLICY_MONTHS = {"python": 24, "numpy": 18, "setuptools": 42}
+ POLICY_MONTHS = {"python": 24, "numpy": 18}
POLICY_MONTHS_DEFAULT = 12
- POLICY_OVERRIDE = {
-     # setuptools-scm doesn't work with setuptools < 36.7 (Nov 2017).
-     # The conda metadata is malformed for setuptools < 38.4 (Jan 2018)
-     # (it's missing a timestamp which prevents this tool from working).
-     # setuptools < 40.4 (Sep 2018) from conda-forge cannot be installed into a py37
-     # environment
-     # TODO remove this special case and the matching note in installing.rst
-     # after March 2022.
-     "setuptools": (40, 4),
- }
+ POLICY_OVERRIDE: Dict[str, Tuple[int, int]] = {}
has_errors = False


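The tables above encode xarray's rolling support policy: the pinned minimum for Python must be at least 24 months old, numpy at least 18 months, and everything else at least `POLICY_MONTHS_DEFAULT` (12) months. A rough sketch of computing such a cutoff date — purely illustrative, not the actual min_deps_check.py logic:

```python
from datetime import date, timedelta

POLICY_MONTHS = {"python": 24, "numpy": 18}  # mirrors the table above
POLICY_MONTHS_DEFAULT = 12

def policy_cutoff(package: str, today: date) -> date:
    # Releases older than this date are eligible to become the pinned minimum.
    months = POLICY_MONTHS.get(package, POLICY_MONTHS_DEFAULT)
    return today - timedelta(days=round(months * 30.44))  # approximate a month

numpy_cutoff = policy_cutoff("numpy", date(2021, 11, 5))
```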
6 changes: 3 additions & 3 deletions ci/requirements/py37-bare-minimum.yml
@@ -10,7 +10,7 @@ dependencies:
- pytest-cov
- pytest-env
- pytest-xdist
- - numpy=1.17
- - pandas=1.0
- - setuptools=40.4
+ - numpy=1.18
+ - pandas=1.1
  - typing_extensions=3.7
+ - importlib-metadata=2.0
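A quick way to sanity-check an environment against major.minor pins like `numpy=1.18` is tuple comparison; a hedged sketch (the helper and table are made up for illustration, not part of xarray's CI):

```python
# Minimums from the environment file above, as (major, minor) tuples.
MIN_PINS = {"numpy": (1, 18), "pandas": (1, 1), "typing_extensions": (3, 7)}

def meets_minimum(package: str, installed: str) -> bool:
    # Compare only the major.minor components against the pinned floor.
    major, minor = (int(part) for part in installed.split(".")[:2])
    return (major, minor) >= MIN_PINS[package]
```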
32 changes: 17 additions & 15 deletions ci/requirements/py37-min-all-deps.yml
@@ -10,29 +10,32 @@ dependencies:
- python=3.7
- boto3=1.13
- bottleneck=1.3
# cartopy 0.18 conflicts with pynio
- cartopy=0.17
- cdms2=3.1
- cfgrib=0.9
- - cftime=1.1
+ - cftime=1.2
  - coveralls
- - dask=2.24
- - distributed=2.24
+ - dask=2.30
+ - distributed=2.30
- h5netcdf=0.8
- h5py=2.10
# hdf5 1.12 conflicts with h5py=2.10
- hdf5=1.10
- hypothesis
- iris=2.4
- - lxml=4.5 # Optional dep of pydap
- - matplotlib-base=3.2
+ - importlib-metadata=2.0
+ - lxml=4.6 # Optional dep of pydap
+ - matplotlib-base=3.3
- nc-time-axis=1.2
# netcdf follows a 1.major.minor[.patch] convention
# (see https://github.com/Unidata/netcdf4-python/issues/1090)
# bumping the netCDF4 version is currently blocked by #4491
- netcdf4=1.5.3
- - numba=0.49
- - numpy=1.17
- - pandas=1.0
- - pint=0.15
+ - numba=0.51
+ - numpy=1.18
+ - pandas=1.1
+ - pint=0.16
- pip
- pseudonetcdf=3.1
- pydap=3.2
@@ -42,12 +45,11 @@ dependencies:
- pytest-env
- pytest-xdist
- rasterio=1.1
- - scipy=1.4
- - seaborn=0.10
- - setuptools=40.4
- - sparse=0.8
- - toolz=0.10
+ - scipy=1.5
+ - seaborn=0.11
+ - sparse=0.11
+ - toolz=0.11
- typing_extensions=3.7
- - zarr=2.4
+ - zarr=2.5
- pip:
- numbagg==0.1
208 changes: 0 additions & 208 deletions doc/api-hidden.rst
@@ -9,24 +9,6 @@
.. autosummary::
:toctree: generated/

Dataset.nbytes
Dataset.chunks

Dataset.all
Dataset.any
Dataset.argmax
Dataset.argmin
Dataset.idxmax
Dataset.idxmin
Dataset.max
Dataset.min
Dataset.mean
Dataset.median
Dataset.prod
Dataset.sum
Dataset.std
Dataset.var

core.coordinates.DatasetCoordinates.get
core.coordinates.DatasetCoordinates.items
core.coordinates.DatasetCoordinates.keys
@@ -39,84 +21,13 @@
core.coordinates.DatasetCoordinates.indexes
core.coordinates.DatasetCoordinates.variables

core.rolling.DatasetCoarsen.all
core.rolling.DatasetCoarsen.any
core.rolling.DatasetCoarsen.construct
core.rolling.DatasetCoarsen.count
core.rolling.DatasetCoarsen.max
core.rolling.DatasetCoarsen.mean
core.rolling.DatasetCoarsen.median
core.rolling.DatasetCoarsen.min
core.rolling.DatasetCoarsen.prod
core.rolling.DatasetCoarsen.reduce
core.rolling.DatasetCoarsen.std
core.rolling.DatasetCoarsen.sum
core.rolling.DatasetCoarsen.var
core.rolling.DatasetCoarsen.boundary
core.rolling.DatasetCoarsen.coord_func
core.rolling.DatasetCoarsen.obj
core.rolling.DatasetCoarsen.side
core.rolling.DatasetCoarsen.trim_excess
core.rolling.DatasetCoarsen.windows

core.groupby.DatasetGroupBy.assign
core.groupby.DatasetGroupBy.assign_coords
core.groupby.DatasetGroupBy.first
core.groupby.DatasetGroupBy.last
core.groupby.DatasetGroupBy.fillna
core.groupby.DatasetGroupBy.quantile
core.groupby.DatasetGroupBy.where
core.groupby.DatasetGroupBy.all
core.groupby.DatasetGroupBy.any
core.groupby.DatasetGroupBy.count
core.groupby.DatasetGroupBy.max
core.groupby.DatasetGroupBy.mean
core.groupby.DatasetGroupBy.median
core.groupby.DatasetGroupBy.min
core.groupby.DatasetGroupBy.prod
core.groupby.DatasetGroupBy.std
core.groupby.DatasetGroupBy.sum
core.groupby.DatasetGroupBy.var
core.groupby.DatasetGroupBy.dims
core.groupby.DatasetGroupBy.groups

core.resample.DatasetResample.all
core.resample.DatasetResample.any
core.resample.DatasetResample.apply
core.resample.DatasetResample.assign
core.resample.DatasetResample.assign_coords
core.resample.DatasetResample.bfill
core.resample.DatasetResample.count
core.resample.DatasetResample.ffill
core.resample.DatasetResample.fillna
core.resample.DatasetResample.first
core.resample.DatasetResample.last
core.resample.DatasetResample.map
core.resample.DatasetResample.max
core.resample.DatasetResample.mean
core.resample.DatasetResample.median
core.resample.DatasetResample.min
core.resample.DatasetResample.prod
core.resample.DatasetResample.quantile
core.resample.DatasetResample.reduce
core.resample.DatasetResample.std
core.resample.DatasetResample.sum
core.resample.DatasetResample.var
core.resample.DatasetResample.where
core.resample.DatasetResample.dims
core.resample.DatasetResample.groups

core.rolling.DatasetRolling.argmax
core.rolling.DatasetRolling.argmin
core.rolling.DatasetRolling.count
core.rolling.DatasetRolling.max
core.rolling.DatasetRolling.mean
core.rolling.DatasetRolling.median
core.rolling.DatasetRolling.min
core.rolling.DatasetRolling.prod
core.rolling.DatasetRolling.std
core.rolling.DatasetRolling.sum
core.rolling.DatasetRolling.var
core.rolling.DatasetRolling.center
core.rolling.DatasetRolling.dim
core.rolling.DatasetRolling.min_periods
@@ -127,49 +38,12 @@
core.weighted.DatasetWeighted.obj
core.weighted.DatasetWeighted.weights

core.rolling_exp.RollingExp.mean

Dataset.argsort
Dataset.astype
Dataset.clip
Dataset.conj
Dataset.conjugate
Dataset.imag
Dataset.round
Dataset.real
Dataset.cumsum
Dataset.cumprod
Dataset.rank

Dataset.load_store
Dataset.dump_to_store

DataArray.ndim
DataArray.nbytes
DataArray.shape
DataArray.size
DataArray.dtype
DataArray.nbytes
DataArray.chunks

DataArray.astype
DataArray.item

DataArray.all
DataArray.any
DataArray.argmax
DataArray.argmin
DataArray.idxmax
DataArray.idxmin
DataArray.max
DataArray.min
DataArray.mean
DataArray.median
DataArray.prod
DataArray.sum
DataArray.std
DataArray.var

core.coordinates.DataArrayCoordinates.get
core.coordinates.DataArrayCoordinates.items
core.coordinates.DataArrayCoordinates.keys
@@ -182,82 +56,13 @@
core.coordinates.DataArrayCoordinates.indexes
core.coordinates.DataArrayCoordinates.variables

core.rolling.DataArrayCoarsen.all
core.rolling.DataArrayCoarsen.any
core.rolling.DataArrayCoarsen.construct
core.rolling.DataArrayCoarsen.count
core.rolling.DataArrayCoarsen.max
core.rolling.DataArrayCoarsen.mean
core.rolling.DataArrayCoarsen.median
core.rolling.DataArrayCoarsen.min
core.rolling.DataArrayCoarsen.prod
core.rolling.DataArrayCoarsen.reduce
core.rolling.DataArrayCoarsen.std
core.rolling.DataArrayCoarsen.sum
core.rolling.DataArrayCoarsen.var
core.rolling.DataArrayCoarsen.boundary
core.rolling.DataArrayCoarsen.coord_func
core.rolling.DataArrayCoarsen.obj
core.rolling.DataArrayCoarsen.side
core.rolling.DataArrayCoarsen.trim_excess
core.rolling.DataArrayCoarsen.windows

core.groupby.DataArrayGroupBy.assign_coords
core.groupby.DataArrayGroupBy.first
core.groupby.DataArrayGroupBy.last
core.groupby.DataArrayGroupBy.fillna
core.groupby.DataArrayGroupBy.quantile
core.groupby.DataArrayGroupBy.where
core.groupby.DataArrayGroupBy.all
core.groupby.DataArrayGroupBy.any
core.groupby.DataArrayGroupBy.count
core.groupby.DataArrayGroupBy.max
core.groupby.DataArrayGroupBy.mean
core.groupby.DataArrayGroupBy.median
core.groupby.DataArrayGroupBy.min
core.groupby.DataArrayGroupBy.prod
core.groupby.DataArrayGroupBy.std
core.groupby.DataArrayGroupBy.sum
core.groupby.DataArrayGroupBy.var
core.groupby.DataArrayGroupBy.dims
core.groupby.DataArrayGroupBy.groups

core.resample.DataArrayResample.all
core.resample.DataArrayResample.any
core.resample.DataArrayResample.apply
core.resample.DataArrayResample.assign_coords
core.resample.DataArrayResample.bfill
core.resample.DataArrayResample.count
core.resample.DataArrayResample.ffill
core.resample.DataArrayResample.fillna
core.resample.DataArrayResample.first
core.resample.DataArrayResample.last
core.resample.DataArrayResample.map
core.resample.DataArrayResample.max
core.resample.DataArrayResample.mean
core.resample.DataArrayResample.median
core.resample.DataArrayResample.min
core.resample.DataArrayResample.prod
core.resample.DataArrayResample.quantile
core.resample.DataArrayResample.reduce
core.resample.DataArrayResample.std
core.resample.DataArrayResample.sum
core.resample.DataArrayResample.var
core.resample.DataArrayResample.where
core.resample.DataArrayResample.dims
core.resample.DataArrayResample.groups

core.rolling.DataArrayRolling.argmax
core.rolling.DataArrayRolling.argmin
core.rolling.DataArrayRolling.count
core.rolling.DataArrayRolling.max
core.rolling.DataArrayRolling.mean
core.rolling.DataArrayRolling.median
core.rolling.DataArrayRolling.min
core.rolling.DataArrayRolling.prod
core.rolling.DataArrayRolling.std
core.rolling.DataArrayRolling.sum
core.rolling.DataArrayRolling.var
core.rolling.DataArrayRolling.center
core.rolling.DataArrayRolling.dim
core.rolling.DataArrayRolling.min_periods
@@ -268,19 +73,6 @@
core.weighted.DataArrayWeighted.obj
core.weighted.DataArrayWeighted.weights

DataArray.argsort
DataArray.clip
DataArray.conj
DataArray.conjugate
DataArray.imag
DataArray.searchsorted
DataArray.round
DataArray.real
DataArray.T
DataArray.cumsum
DataArray.cumprod
DataArray.rank

core.accessor_dt.DatetimeAccessor.ceil
core.accessor_dt.DatetimeAccessor.floor
core.accessor_dt.DatetimeAccessor.round
