Commit b9507fe
Merge branch 'main' into custom-groupers
* main:
  Add download stats badges (pydata#9786)
  Fix open_mfdataset for list of fsspec files (pydata#9785)
  add 'User-Agent'-header to pooch.retrieve (pydata#9782)
  Optimize `ffill`, `bfill` with dask when `limit` is specified (pydata#9771)
2 parents f0f838c + d5f84dd commit b9507fe

8 files changed: 92 additions, 37 deletions
README.md

Lines changed: 3 additions & 1 deletion

@@ -4,9 +4,11 @@
 [![Code coverage](https://codecov.io/gh/pydata/xarray/branch/main/graph/badge.svg?flag=unittests)](https://codecov.io/gh/pydata/xarray)
 [![Docs](https://readthedocs.org/projects/xray/badge/?version=latest)](https://docs.xarray.dev/)
 [![Benchmarked with asv](https://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat)](https://pandas.pydata.org/speed/xarray/)
-[![Available on pypi](https://img.shields.io/pypi/v/xarray.svg)](https://pypi.python.org/pypi/xarray/)
 [![Formatted with black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/python/black)
 [![Checked with mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/)
+[![Available on pypi](https://img.shields.io/pypi/v/xarray.svg)](https://pypi.python.org/pypi/xarray/)
+[![PyPI - Downloads](https://img.shields.io/pypi/dm/xarray)](https://pypistats.org/packages/xarray)
+[![Conda - Downloads](https://img.shields.io/conda/dn/anaconda/xarray?label=conda%7Cdownloads)](https://anaconda.org/anaconda/xarray)
 [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.11183201.svg)](https://doi.org/10.5281/zenodo.11183201)
 [![Examples on binder](https://img.shields.io/badge/launch-binder-579ACA.svg?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFkAAABZCAMAAABi1XidAAAB8lBMVEX///9XmsrmZYH1olJXmsr1olJXmsrmZYH1olJXmsr1olJXmsrmZYH1olL1olJXmsr1olJXmsrmZYH1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olJXmsrmZYH1olL1olL0nFf1olJXmsrmZYH1olJXmsq8dZb1olJXmsrmZYH1olJXmspXmspXmsr1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olLeaIVXmsrmZYH1olL1olL1olJXmsrmZYH1olLna31Xmsr1olJXmsr1olJXmsrmZYH1olLqoVr1olJXmsr1olJXmsrmZYH1olL1olKkfaPobXvviGabgadXmsqThKuofKHmZ4Dobnr1olJXmsr1olJXmspXmsr1olJXmsrfZ4TuhWn1olL1olJXmsqBi7X1olJXmspZmslbmMhbmsdemsVfl8ZgmsNim8Jpk8F0m7R4m7F5nLB6jbh7jbiDirOEibOGnKaMhq+PnaCVg6qWg6qegKaff6WhnpKofKGtnomxeZy3noG6dZi+n3vCcpPDcpPGn3bLb4/Mb47UbIrVa4rYoGjdaIbeaIXhoWHmZYHobXvpcHjqdHXreHLroVrsfG/uhGnuh2bwj2Hxk17yl1vzmljzm1j0nlX1olL3AJXWAAAAbXRSTlMAEBAQHx8gICAuLjAwMDw9PUBAQEpQUFBXV1hgYGBkcHBwcXl8gICAgoiIkJCQlJicnJ2goKCmqK+wsLC4usDAwMjP0NDQ1NbW3Nzg4ODi5+3v8PDw8/T09PX29vb39/f5+fr7+/z8/Pz9/v7+zczCxgAABC5JREFUeAHN1ul3k0UUBvCb1CTVpmpaitAGSLSpSuKCLWpbTKNJFGlcSMAFF63iUmRccNG6gLbuxkXU66JAUef/9LSpmXnyLr3T5AO/rzl5zj137p136BISy44fKJXuGN/d19PUfYeO67Znqtf2KH33Id1psXoFdW30sPZ1sMvs2D060AHqws4FHeJojLZqnw53cmfvg+XR8mC0OEjuxrXEkX5ydeVJLVIlV0e10PXk5k7dYeHu7Cj1j+49uKg7uLU61tGLw1lq27ugQYlclHC4bgv7VQ+TAyj5Zc/UjsPvs1sd5cWryWObtvWT2EPa4rtnWW3JkpjggEpbOsPr7F7EyNewtpBIslA7p43HCsnwooXTEc3UmPmCNn5lrqTJxy6nRmcavGZVt/3Da2pD5NHvsOHJCrdc1G2r3DITpU7yic7w/7Rxnjc0kt5GC4djiv2Sz3Fb2iEZg41/ddsFDoyuYrIkmFehz0HR2thPgQqMyQYb2OtB0WxsZ3BeG3+wpRb1vzl2UYBog8FfGhttFKjtAclnZYrRo9ryG9uG/FZQU4AEg8ZE9LjGMzTmqKXPLnlWVnIlQQTvxJf8ip7VgjZjyVPrjw1te5otM7RmP7xm+sK2Gv9I8Gi++BRbEkR9EBw8zRUcKxwp73xkaLiqQb+kGduJTNHG72zcW9LoJgqQxpP3/Tj//c3yB0tqzaml05/+orHLksVO+95kX7/7qgJvnjlrfr2Ggsyx0eoy9uPzN5SPd86aXggOsEKW2Prz7du3VID3/tzs/sSRs2w7ovVHKtjrX2pd7ZMlTxAYfBAL9jiDwfLkq55Tm7ifhMlTGPyCAs7RFRhn47JnlcB9RM5T97ASuZXIcVNuUDIndpDbdsfrqsOppeXl5Y+XVKdjFCTh+zGaVuj0d9zy05PPK3QzBamxdwtTCrzyg/2Rvf2EstUjordGwa/kx9mSJLr8mLLtCW8HHGJc2R5hS219IiF6PnTusOqcMl57gm0Z8kanKMAQg0qSyuZfn7zItsbGyO9QlnxY0eCuD1XL2ys/MsrQhltE7Ug0uFOzufJFE2PxBo/YAx8XPPdDwWN0MrDRYIZF0mSMKCNHgaIVFoBbNoLJ7tEQDKxGF0kcLQimojCZopv0OkNOyWCCg9XMVAi7ARJzQdM2QUh0gmBozjc3Skg6dSBRqDGYSUOu66Zg+I2fNZs/M3/f/Grl/XnyF1Gw3VKCez0PN5IUfFLqvgUN4C0qNqYs5YhPL+aVZYDE4IpUk57oSFnJm4FyCqqOE0jhY2SMyLFoo56zyo6becOS5UVDdj7Vih0zp+tcMhwRpBeLyqtIjlJKAIZSbI8SGSF3k0pA3mR5tHuwPFoa7N7reoq2bqCsAk1HqCu5uvI1n6JuRXI+S1Mco54YmYTwcn6Aeic+kssXi8XpXC4V3t7/ADuTNKaQJdScAAAAAElFTkSuQmCC)](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/weather-data.ipynb)
 [![Twitter](https://img.shields.io/twitter/follow/xarray_dev?style=social)](https://twitter.com/xarray_dev)

doc/whats-new.rst

Lines changed: 6 additions & 0 deletions

@@ -29,6 +29,10 @@ New Features
 - Support lazy grouping by dask arrays, and allow specifying ordered groups with ``UniqueGrouper(labels=["a", "b", "c"])``
   (:issue:`2852`, :issue:`757`).
   By `Deepak Cherian <https://github.com/dcherian>`_.
+- Optimize ffill, bfill with dask when limit is specified
+  (:pull:`9771`).
+  By `Joseph Nowak <https://github.com/josephnowak>`_, and
+  `Patrick Hoefler <https://github.com/phofl>`.
 - Allow wrapping ``np.ndarray`` subclasses, e.g. ``astropy.units.Quantity`` (:issue:`9704`, :pull:`9760`).
   By `Sam Levang <https://github.com/slevang>`_ and `Tien Vo <https://github.com/tien-vo>`_.
 - Optimize :py:meth:`DataArray.polyfit` and :py:meth:`Dataset.polyfit` with dask, when used with
@@ -60,6 +64,8 @@ Bug fixes
   By `Pascal Bourgault <https://github.com/aulemahal>`_.
 - Fix CF decoding of ``grid_mapping`` to allow all possible formats, add tests (:issue:`9761`, :pull:`9765`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
+- Add `User-Agent` to request-headers when retrieving tutorial data (:issue:`9774`, :pull:`9782`)
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.

 Documentation
 ~~~~~~~~~~~~~
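The ffill/bfill entry above is the user-facing side of the dask_array_ops.push rewrite further down in this commit. A rough usage sketch follows; the values and chunk sizes are illustrative only, and bottleneck or numbagg must be installed and enabled:

import numpy as np
import xarray as xr

da = xr.DataArray(
    np.array([1.0, np.nan, np.nan, np.nan, 5.0, np.nan]),
    dims="time",
).chunk({"time": 2})

# Fill at most two consecutive NaNs along "time"; the computation stays lazy
# and now avoids the older arange-broadcast path when limit is given.
filled = da.ffill("time", limit=2)
print(filled.compute().values)  # [ 1.  1.  1. nan  5.  5.]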

xarray/backends/common.py

Lines changed: 9 additions & 8 deletions

@@ -136,14 +136,15 @@ def _find_absolute_paths(
     def _normalize_path_list(
         lpaths: NestedSequence[str | os.PathLike],
     ) -> NestedSequence[str]:
-        return [
-            (
-                _normalize_path(p)
-                if isinstance(p, str | os.PathLike)
-                else _normalize_path_list(p)
-            )
-            for p in lpaths
-        ]
+        paths = []
+        for p in lpaths:
+            if isinstance(p, str | os.PathLike):
+                paths.append(_normalize_path(p))
+            elif isinstance(p, list):
+                paths.append(_normalize_path_list(p))  # type: ignore[arg-type]
+            else:
+                paths.append(p)  # type: ignore[arg-type]
+        return paths

     return _normalize_path_list(paths)
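The practical difference from the old comprehension is that entries which are neither paths nor lists (for example already-open fsspec file objects) are now passed through untouched instead of being recursed into. Below is a standalone sketch of that pattern, not xarray's own code; os.path.abspath stands in for _normalize_path, which is an assumption about its effect:

import io
import os
from pathlib import Path


def normalize_nested(paths):
    out = []
    for p in paths:
        if isinstance(p, (str, os.PathLike)):
            out.append(os.path.abspath(os.fspath(p)))  # stand-in for _normalize_path
        elif isinstance(p, list):
            out.append(normalize_nested(p))  # recurse into nested lists
        else:
            out.append(p)  # e.g. a file-like object: leave it alone
    return out


# Strings and Path objects are normalized; the BytesIO stand-in for an open
# file object survives unchanged.
print(normalize_nested(["data.nc", [Path("more.nc")], io.BytesIO(b"")]))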

xarray/core/dask_array_ops.py

Lines changed: 52 additions & 22 deletions

@@ -75,41 +75,71 @@ def least_squares(lhs, rhs, rcond=None, skipna=False):
     return coeffs, residuals


-def push(array, n, axis):
+def push(array, n, axis, method="blelloch"):
     """
     Dask-aware bottleneck.push
     """
     import dask.array as da
     import numpy as np

     from xarray.core.duck_array_ops import _push
+    from xarray.core.nputils import nanlast
+
+    if n is not None and all(n <= size for size in array.chunks[axis]):
+        return array.map_overlap(_push, depth={axis: (n, 0)}, n=n, axis=axis)
+
+    # TODO: Replace all this function
+    # once https://github.com/pydata/xarray/issues/9229 being implemented

     def _fill_with_last_one(a, b):
-        # cumreduction apply the push func over all the blocks first so, the only missing part is filling
-        # the missing values using the last data of the previous chunk
-        return np.where(~np.isnan(b), b, a)
+        # cumreduction apply the push func over all the blocks first so,
+        # the only missing part is filling the missing values using the
+        # last data of the previous chunk
+        return np.where(np.isnan(b), a, b)

-    if n is not None and 0 < n < array.shape[axis] - 1:
-        arange = da.broadcast_to(
-            da.arange(
-                array.shape[axis], chunks=array.chunks[axis], dtype=array.dtype
-            ).reshape(
-                tuple(size if i == axis else 1 for i, size in enumerate(array.shape))
-            ),
-            array.shape,
-            array.chunks,
-        )
-        valid_arange = da.where(da.notnull(array), arange, np.nan)
-        valid_limits = (arange - push(valid_arange, None, axis)) <= n
-        # omit the forward fill that violate the limit
-        return da.where(valid_limits, push(array, None, axis), np.nan)
-
-    # The method parameter makes that the tests for python 3.7 fails.
-    return da.reductions.cumreduction(
-        func=_push,
+    def _dtype_push(a, axis, dtype=None):
+        # Not sure why the blelloch algorithm force to receive a dtype
+        return _push(a, axis=axis)
+
+    pushed_array = da.reductions.cumreduction(
+        func=_dtype_push,
         binop=_fill_with_last_one,
         ident=np.nan,
         x=array,
         axis=axis,
         dtype=array.dtype,
+        method=method,
+        preop=nanlast,
     )
+
+    if n is not None and 0 < n < array.shape[axis] - 1:
+
+        def _reset_cumsum(a, axis, dtype=None):
+            cumsum = np.cumsum(a, axis=axis)
+            reset_points = np.maximum.accumulate(np.where(a == 0, cumsum, 0), axis=axis)
+            return cumsum - reset_points
+
+        def _last_reset_cumsum(a, axis, keepdims=None):
+            # Take the last cumulative sum taking into account the reset
+            # This is useful for blelloch method
+            return np.take(_reset_cumsum(a, axis=axis), axis=axis, indices=[-1])
+
+        def _combine_reset_cumsum(a, b):
+            # It is going to sum the previous result until the first
+            # non nan value
+            bitmask = np.cumprod(b != 0, axis=axis)
+            return np.where(bitmask, b + a, b)
+
+        valid_positions = da.reductions.cumreduction(
+            func=_reset_cumsum,
+            binop=_combine_reset_cumsum,
+            ident=0,
+            x=da.isnan(array, dtype=int),
+            axis=axis,
+            dtype=int,
+            method=method,
+            preop=_last_reset_cumsum,
+        )
+        pushed_array = da.where(valid_positions <= n, pushed_array, np.nan)
+
+    return pushed_array
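The limit handling above is easier to follow outside of dask: a cumulative sum of the NaN mask that resets to zero at every valid value gives, for each element, the length of the NaN run it sits in, and filled positions whose run length exceeds n are masked back to NaN. The NumPy-only sketch below reproduces that bookkeeping for illustration; it is not the code path xarray executes, and method="blelloch" merely selects dask's parallel prefix-scan variant of cumreduction rather than the sequential one:

import numpy as np


def reset_cumsum(mask):
    # mask is 1 where the original array is NaN, 0 otherwise
    cumsum = np.cumsum(mask)
    reset_points = np.maximum.accumulate(np.where(mask == 0, cumsum, 0))
    return cumsum - reset_points


arr = np.array([1.0, np.nan, np.nan, np.nan, 5.0, np.nan])
runs = reset_cumsum(np.isnan(arr).astype(int))
print(runs)  # [0 1 2 3 0 1] -- length of the NaN run at each position

n = 2
pushed = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 5.0])  # unlimited forward fill
print(np.where(runs <= n, pushed, np.nan))  # [ 1.  1.  1. nan  5.  5.]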

xarray/core/duck_array_ops.py

Lines changed: 4 additions & 2 deletions

@@ -716,6 +716,7 @@ def first(values, axis, skipna=None):
             return chunked_nanfirst(values, axis)
         else:
             return nputils.nanfirst(values, axis)
+
     return take(values, 0, axis=axis)


@@ -729,6 +730,7 @@ def last(values, axis, skipna=None):
             return chunked_nanlast(values, axis)
         else:
             return nputils.nanlast(values, axis)
+
     return take(values, -1, axis=axis)


@@ -769,14 +771,14 @@ def _push(array, n: int | None = None, axis: int = -1):
     return bn.push(array, limit, axis)


-def push(array, n, axis):
+def push(array, n, axis, method="blelloch"):
     if not OPTIONS["use_bottleneck"] and not OPTIONS["use_numbagg"]:
         raise RuntimeError(
             "ffill & bfill requires bottleneck or numbagg to be enabled."
             " Call `xr.set_options(use_bottleneck=True)` or `xr.set_options(use_numbagg=True)` to enable one."
         )
     if is_duck_dask_array(array):
-        return dask_array_ops.push(array, n, axis)
+        return dask_array_ops.push(array, n, axis, method=method)
     else:
         return _push(array, n, axis)
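This dispatch is also where the use_bottleneck / use_numbagg options are enforced. A small, hedged illustration of that behaviour, assuming bottleneck or numbagg is installed for the final call:

import numpy as np
import xarray as xr

da = xr.DataArray([1.0, np.nan, 3.0], dims="x")

# With both accelerated backends switched off, ffill/bfill refuse to run.
with xr.set_options(use_bottleneck=False, use_numbagg=False):
    try:
        da.ffill("x")
    except RuntimeError as err:
        print(err)

# With the default options the NumPy-backed path goes through _push, while a
# dask-backed array would be routed to dask_array_ops.push(..., method=...).
print(da.ffill("x").values)  # [1. 1. 3.]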

xarray/tests/test_backends.py

Lines changed: 2 additions & 0 deletions

@@ -5511,6 +5511,8 @@ def test_source_encoding_always_present_with_fsspec() -> None:
         fs = fsspec.filesystem("file")
         with fs.open(tmp) as f, open_dataset(f) as ds:
             assert ds.encoding["source"] == tmp
+        with fs.open(tmp) as f, open_mfdataset([f]) as ds:
+            assert "foo" in ds


 def _assert_no_dates_out_of_range_warning(record):
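The new assertions exercise the open_mfdataset fix from this merge: a list of already-open fsspec file objects can now be passed directly. A rough usage sketch; the file name is a placeholder, and fsspec plus an engine that accepts file objects (such as h5netcdf) are assumed to be installed:

import fsspec
import xarray as xr

fs = fsspec.filesystem("file")
with fs.open("example.nc") as f, xr.open_mfdataset([f]) as ds:  # placeholder file
    print(ds)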

xarray/tests/test_duck_array_ops.py

Lines changed: 9 additions & 3 deletions

@@ -1008,7 +1008,8 @@ def test_least_squares(use_dask, skipna):

 @requires_dask
 @requires_bottleneck
-def test_push_dask():
+@pytest.mark.parametrize("method", ["sequential", "blelloch"])
+def test_push_dask(method):
     import bottleneck
     import dask.array

@@ -1018,13 +1019,18 @@ def test_push_dask():
         expected = bottleneck.push(array, axis=0, n=n)
         for c in range(1, 11):
             with raise_if_dask_computes():
-                actual = push(dask.array.from_array(array, chunks=c), axis=0, n=n)
+                actual = push(
+                    dask.array.from_array(array, chunks=c), axis=0, n=n, method=method
+                )
             np.testing.assert_equal(actual, expected)

         # some chunks of size-1 with NaN
         with raise_if_dask_computes():
             actual = push(
-                dask.array.from_array(array, chunks=(1, 2, 3, 2, 2, 1, 1)), axis=0, n=n
+                dask.array.from_array(array, chunks=(1, 2, 3, 2, 2, 1, 1)),
+                axis=0,
+                n=n,
+                method=method,
             )
         np.testing.assert_equal(actual, expected)

xarray/tutorial.py

Lines changed: 7 additions & 1 deletion

@@ -10,6 +10,7 @@

 import os
 import pathlib
+import sys
 from typing import TYPE_CHECKING

 import numpy as np
@@ -157,8 +158,13 @@ def open_dataset(

     url = f"{base_url}/raw/{version}/{path.name}"

+    headers = {"User-Agent": f"xarray {sys.modules['xarray'].__version__}"}
+    downloader = pooch.HTTPDownloader(headers=headers)
+
     # retrieve the file
-    filepath = pooch.retrieve(url=url, known_hash=None, path=cache_dir)
+    filepath = pooch.retrieve(
+        url=url, known_hash=None, path=cache_dir, downloader=downloader
+    )
     ds = _open_dataset(filepath, engine=engine, **kws)
     if not cache:
         ds = ds.load()
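pooch.HTTPDownloader forwards its keyword arguments to requests.get, which is what lets the change above attach a User-Agent header to tutorial-data downloads. A standalone sketch of the same mechanism; the URL and version string are placeholders, not xarray's real tutorial-data location:

import pooch

headers = {"User-Agent": "xarray 2024.11.0"}  # illustrative version string
downloader = pooch.HTTPDownloader(headers=headers)

filepath = pooch.retrieve(
    url="https://example.com/air_temperature.nc",  # placeholder URL
    known_hash=None,
    path=pooch.os_cache("xarray_tutorial_sketch"),
    downloader=downloader,
)
print(filepath)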
