This repository has been archived by the owner on Dec 10, 2018. It is now read-only.
This Stack Overflow answer seems to say that numpy arrays of dtype='object' behave badly when storing anything that has a __len__ attribute.
Unfortunately, the method proposed in the xarray issue above seems far more complicated than necessary for my purposes.
One (hacky) solution might be to use numpy to store an object whose only purpose is to contain a single xarray.Dataset, without exposing a __len__ or __getitem__ attribute, something like:
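A minimal sketch of such a wrapper (the `Box` name and the list payloads are hypothetical stand-ins; in practice the payload would be an xarray.Dataset):

```python
import numpy as np

class Box:
    """Opaque container: holds one payload (e.g. an xarray.Dataset) but
    deliberately exposes no __len__ or __getitem__, so numpy treats each
    instance as a scalar rather than trying to unpack it."""
    def __init__(self, payload):
        self.payload = payload

# Wrapped, each element stays a single object in a 1-D object array,
# instead of numpy attempting to iterate into the payloads.
arr = np.array([Box([1, 2]), Box([3, 4, 5])], dtype=object)
```

The wrapper costs one attribute lookup (`box.payload`) at every use site, but keeps the array shape under your control.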
On the other hand, if the only numpy functionality I'm using is np.apply_along_axis and np.ndenumerate, it might be easier to just use a list of lists and recursively call a function (like this one) which concatenates the objects residing in the deepest lists.
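A generic sketch of that recursive idea (the `concat_nested` name and the `combiners` argument are assumptions, not the linked function: one join function per nesting level, which for Datasets could be something like `functools.partial(xr.concat, dim=...)`):

```python
def concat_nested(nested, combiners):
    """Recursively collapse a nested list, innermost levels first.

    `combiners` supplies one function per nesting depth; each takes a flat
    list of already-collapsed objects and joins them (for xarray, e.g.
    functools.partial(xr.concat, dim='x') at that level).
    """
    if isinstance(nested, list):
        collapsed = [concat_nested(item, combiners[1:]) for item in nested]
        return combiners[0](collapsed)
    return nested

# Demo with strings standing in for Datasets: inner level joined with '-',
# outer level joined with '|'.
result = concat_nested([["a", "b"], ["c", "d"]],
                       ["|".join, "-".join])
# result == "a-b|c-d"
```

This avoids numpy's object-array quirks entirely, at the cost of writing the recursion by hand.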
I'm unsure what the best structure is for holding the datasets to be concatenated.
I tried a numpy.ndarray but that did not play well with xarray, as discussed at pydata/xarray#2159.
Might have to resort to using a list of lists?
Whatever structure I use should be something that is easy for a user to create if necessary.