Wrong units when using da.integrate() #205

Closed
astoeriko opened this issue Feb 23, 2023 · 4 comments · Fixed by #163

Comments

@astoeriko

Xarray has the nice xarray.DataArray.integrate method to integrate a data array over one of its coordinates. It automatically accounts for the values of the coordinate, so the integration result should have the units of the original array times the units of the coordinate. However, if I integrate a unit-aware DataArray, the result does not change its units. Here is a small example:

import numpy as np
import xarray as xr
import pint_xarray

t = xr.IndexVariable("time", np.arange(20), attrs={"units": "s"})
velocity = xr.DataArray(
    np.random.rand(20),
    dims="time",
    coords={"time": t},
    attrs={"units": "m/s"}
).pint.quantify()

velocity.integrate("time")

The result displays as a 0-d DataArray whose units are still m/s rather than m:

    9.440035502696144 m/s
    Coordinates: (0)
    Indexes: (0)
    Attributes: (0)

I guess the problem is that only the data array is unit-aware, but not its coordinate. If this is not an easy thing to fix, it would be good to strip the units completely, or at least warn the user that the units may be incorrect.
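
A quick way to check that guess, reusing velocity from the example above (exact class names may differ between versions):

type(velocity.data)          # pint.Quantity: the values are unit-aware
type(velocity["time"].data)  # numpy.ndarray: the coordinate is not unit-aware, so integrate() never sees the seconds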

@astoeriko
Author

Oh, I just saw that you briefly mention integrate and the issues that currently exist in this blog post:

This wrapping is currently necessary for any operation which needs to be aware of the units of a dimension coordinate of the dataarray, or any xarray operation which relies on an external library (such as calling scipy in .integrate).

Does that mean there is a workaround that allows the integrate method to work correctly? It is not so clear to me how I would have to do this.

@keewis
Collaborator

keewis commented Feb 24, 2023

there's nothing like that yet, unfortunately. I completely missed the interaction with the coordinates when writing the tests for unit-aware operations in xarray's test suite, so of course the documentation page you're quoting does not mention it, either (and the explicit mention in the blog post is from @TomNicholas).

For now, the best we can do is to mention this in the page, and add a wrapper method to the accessor (so velocity.pint.integrate("time") would do the right thing).
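
For reference, a rough sketch of what such a wrapper could do, done manually for now. This is a hypothetical workaround rather than pint-xarray's actual implementation, and it assumes the dimension coordinate keeps its units string in attrs["units"]:

import pint_xarray  # noqa: F401  (registers the .pint accessor)
import xarray as xr

def integrate_with_units(da: xr.DataArray, coord: str) -> xr.DataArray:
    # Move the units back into attrs, integrate on the plain magnitudes,
    # then re-attach the product of the data units and the coordinate
    # units by hand.
    stripped = da.pint.dequantify()
    result = stripped.integrate(coord)
    combined = f"({stripped.attrs['units']}) * ({stripped[coord].attrs['units']})"
    return result.pint.quantify(combined)

With the example from the top of this issue, integrate_with_units(velocity, "time") should then come back in metres rather than metres per second.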

@LecrisUT

LecrisUT commented Dec 7, 2023

Note: this is more general. For example, multiplying by a coordinate also does not propagate the units; we should be able to do:

import xarray
import pint_xarray
import pint
import numpy

ureg = pint.UnitRegistry(system="atomic", force_ndarray_like=True)
unit = ureg.bohr

# normalized Gaussian wave packet on a grid carrying units of bohr
x = numpy.linspace(-5, 5, 501) * unit
alpha = 1 * unit ** -2
psi = (2 * alpha / numpy.pi) ** 0.25 * numpy.exp(-alpha * x ** 2)

da = xarray.DataArray(data=psi, coords=dict(x=x))

# <psi|psi>: should be dimensionless
norm = (da.conj() * da).integrate("x")
assert norm == ureg.Quantity(1.0 * ureg.dimensionless)
# <psi|x|psi>: should carry units of bohr
center = (da.conj() * da["x"] * da).integrate("x")
assert center == ureg.Quantity(0.0 * unit)
# <psi|x^2|psi>: should carry units of bohr ** 2
spread = (da.conj() * da["x"] * da["x"] * da).integrate("x")
assert spread == ureg.Quantity(0.25 * unit ** 2)

@keewis
Collaborator

keewis commented Dec 7, 2023

in both cases you're running into the issue that dimension coordinates cannot have units (at least until #162 is merged, which is close but not quite there yet).

In other words, da["x"] does not wrap a quantity (even though x is one), so the result of the integration does not account for the units of the coordinate either.
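
To make that concrete with the example above (exact class names may vary between pint versions):

type(da.data)       # pint.Quantity: the wavefunction values kept their units
type(da["x"].data)  # numpy.ndarray: the bohr units of x were stripped when the DataArray was constructed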

@keewis keewis linked a pull request Dec 9, 2023 that will close this issue
@keewis keewis mentioned this issue Dec 9, 2023