make the station_name_unique method less ugly (duplicate station codes make it unclear which station is which, so we use station_name_unique to avoid overwriting) >> currently disabled (see sketch below)
apply _make_hydrotools_consistent(ds) for all sources; run the testbank to check whether the required attrs are present (see sketch below)
add more todo items from P:\11210366-003-getijverandering\insitu_data\RWS_DDL\download_ddl.py and from dfm_tools/observations.py
rename ddl to rws or rwsddl (no dash)
remove meta_dict from the retrieve function and add an exception to the read_catalog function if anything was passed that could not be filtered; this cleans up the code significantly (see sketch below)
optional (can also be done later): assert for the presence of an nc-file in the retrieve test; this will currently fail for some sources since the requested time/station is not available. test_ssh_retrieve_data does not always download a file, so assert for this and try a different station if no file was written (maybe with a different time min/max) (see sketch below)
rerun example notebook
move the station loop and file saving (incl. _make_hydrotools_consistent) to a generic part to avoid duplication (see sketch below)
add the wgs84 crs to all catalog dataframes (or is that the default?); test this by applying .to_crs() to the dataframe (see sketch below)
store waterlevels as float32 instead of float64 (see sketch below)
hydrotools-compatible function: add an assert for the time variable and dimension (both lowercase) (see sketch below)
maybe clean the cmems dataset if hydrotools cannot work with the lat/lon/pos dims (see sketch below)
download the raw cmems file to a tmpdir so it does not have to be deleted afterwards (see sketch below)
fix the failing tests/test_observations.py::test_ssh_retrieve_data[rwsddl] testcase on GitHub; probably the same cause as "Can only query a limited date range from the API" (ddlpy#18), so it can probably be fixed by sorting the locations dataframe on the numericid column in the test (and possibly looking for a new index, or getting a single row via the station code instead, which is more future-proof) (see sketch below)
move adding the station_name/lat/country/etc attrs to a general function; this requires these attrs to be present in the location catalog/df already, which is desirable. Assert for this (see sketch below).
update the ddlpy dependency from a git link to the pypi package (>=0.3.0), make it non-optional and remove the import checks from observations.py (move the import to the top of the script)
rerun notebook
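For the station_name_unique item: a minimal sketch of how duplicate station names could be disambiguated by appending the station code. The Naam column and the code index are assumptions about the catalog layout, not the actual dfm_tools implementation.

```python
import pandas as pd

def station_name_unique(locations: pd.DataFrame) -> pd.Series:
    # append the station code only where a name occurs more than once,
    # so most station names stay readable but none overwrite each other
    names = locations["Naam"].astype(str)
    codes = pd.Series(locations.index.astype(str), index=locations.index)
    duplicated = names.duplicated(keep=False)
    return names.where(~duplicated, names + "_" + codes)
```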
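For the _make_hydrotools_consistent item: a minimal sketch of a testbank-style check that the required attrs are present on a retrieved dataset. The attribute list is an assumption; the real list should follow whatever hydrotools actually needs.

```python
import xarray as xr

# assumed set of attrs that _make_hydrotools_consistent should guarantee
REQUIRED_ATTRS = ["station_name", "station_id", "longitude", "latitude", "country"]

def assert_required_attrs(ds: xr.Dataset) -> None:
    missing = [attr for attr in REQUIRED_ATTRS if attr not in ds.attrs]
    assert not missing, f"missing required attrs: {missing}"
```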
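For the meta_dict/read_catalog item: a minimal sketch of raising instead of silently ignoring filters that a source cannot apply. The helper name and the keyword-based interface are assumptions.

```python
def _check_unapplied_filters(source: str, **filters) -> None:
    # raise if the caller passed filters that this source could not apply,
    # instead of silently dropping them
    unapplied = {key: val for key, val in filters.items() if val is not None}
    if unapplied:
        raise ValueError(
            f"filters not supported for source '{source}': {sorted(unapplied)}"
        )
```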
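For the retrieve-test item: a minimal sketch of asserting that a netCDF file was actually written and falling back to another station if not. The retrieve_func callable and the per-station catalog layout are assumptions, not the actual test code.

```python
import glob
import os

def retrieve_with_fallback(catalog_gpd, dir_output, time_min, time_max,
                           retrieve_func, max_tries=3):
    # try a few stations until one of them actually yields a netCDF file
    for _, station in catalog_gpd.head(max_tries).iterrows():
        retrieve_func(station, dir_output, time_min=time_min, time_max=time_max)
        nc_files = glob.glob(os.path.join(dir_output, "*.nc"))
        if nc_files:
            return nc_files
    raise AssertionError("no netCDF file was downloaded for any of the tried stations")
```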
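For the generic station loop item: a minimal sketch of a shared writer that the source-specific retrieve functions could feed, so the loop and file saving (incl. _make_hydrotools_consistent) live in one place. The iterator interface and filename pattern are assumptions.

```python
import os

def _write_station_datasets(ds_iter, dir_output, make_consistent):
    # ds_iter yields (station_code, xarray.Dataset) pairs from any source
    os.makedirs(dir_output, exist_ok=True)
    for station_code, ds in ds_iter:
        ds = make_consistent(ds)
        file_nc = os.path.join(dir_output, f"{station_code}.nc")
        ds.to_netcdf(file_nc)
```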
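For the crs item: a minimal geopandas check that the catalog geodataframe carries EPSG:4326, verified by reprojecting with .to_crs() (which fails on naive geometries). The example stations are made up.

```python
import geopandas as gpd
from shapely.geometry import Point

gdf = gpd.GeoDataFrame({"station_code": ["A12", "B34"]},
                       geometry=[Point(4.12, 52.01), Point(3.31, 51.44)],
                       crs="EPSG:4326")
assert gdf.crs is not None
gdf_rd = gdf.to_crs("EPSG:28992")  # raises if no crs was set on the catalog
```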
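For the float32 item: a tiny sketch of casting the data variable; the variable name waterlevel is an assumption.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"waterlevel": ("time", np.array([0.12, -0.03, 0.45]))})
ds["waterlevel"] = ds["waterlevel"].astype("float32")  # cast the data variable only, time stays as-is
assert ds["waterlevel"].dtype == np.float32
```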
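For the hydrotools-compatible time check: a minimal sketch of the assert, assuming hydrotools expects a lowercase time name for both the variable and the dimension.

```python
import xarray as xr

def _assert_lowercase_time(ds: xr.Dataset) -> None:
    assert "time" in ds.variables, "dataset has no 'time' variable"
    assert "time" in ds.dims, "dataset has no 'time' dimension"
    # also catch leftovers like 'TIME' or 'Time' from the raw source files
    wrong_case = [name for name in set(ds.variables) | set(ds.dims)
                  if name.lower() == "time" and name != "time"]
    assert not wrong_case, f"non-lowercase time names found: {wrong_case}"
```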
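For the cmems dataset cleanup item: a minimal sketch that squeezes length-1 position-like dims so only time remains a dimension. The dim names (LATITUDE/LONGITUDE/POSITION/DEPTH) are assumptions about the raw CMEMS insitu files.

```python
import xarray as xr

def _squeeze_position_dims(ds: xr.Dataset) -> xr.Dataset:
    # keep the values as scalar coordinates, but drop the length-1 dims
    # in case hydrotools cannot work with them
    for dim in ["LATITUDE", "LONGITUDE", "POSITION", "DEPTH"]:
        if dim in ds.dims and ds.sizes[dim] == 1:
            ds = ds.squeeze(dim=dim, drop=False)
    return ds
```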
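For the cmems tmpdir item: a minimal sketch using tempfile.TemporaryDirectory so the raw download is cleaned up automatically. download_cmems_file is a hypothetical placeholder for whatever call does the actual download.

```python
import os
import tempfile
import xarray as xr

with tempfile.TemporaryDirectory() as tmpdir:
    file_raw = os.path.join(tmpdir, "cmems_raw.nc")
    download_cmems_file(file_raw)           # hypothetical download call
    ds = xr.open_dataset(file_raw).load()   # load into memory before tmpdir is removed
# the raw file is gone here, only the loaded dataset remains in memory
```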
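For the failing rwsddl testcase: a minimal sketch of selecting the test location via its station code instead of relying on the row order of the full locations dataframe, which should be more future-proof. The station code HOEKVHLD and the Grootheid.Code filter follow the usual ddlpy examples; treat them as assumptions.

```python
import ddlpy

locations = ddlpy.locations()
# select a single waterlevel station by code instead of by positional index,
# so the test does not break when the catalog order or size changes
bool_station = locations.index.isin(["HOEKVHLD"])
bool_grootheid = locations["Grootheid.Code"].isin(["WATHTE"])
selected = locations.loc[bool_station & bool_grootheid]
```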
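For the general attrs item: a minimal sketch of a shared helper that copies station metadata from a catalog row onto the dataset and asserts the required columns exist. The column names are assumptions about the catalog layout.

```python
REQUIRED_COLUMNS = ["station_name", "station_id", "country"]  # assumed catalog columns

def _add_station_attrs(ds, row):
    # row is a single record from the location catalog geodataframe
    missing = [col for col in REQUIRED_COLUMNS if col not in row.index]
    assert not missing, f"location catalog is missing required columns: {missing}"
    return ds.assign_attrs(
        station_name=row["station_name"],
        station_id=row["station_id"],
        country=row["country"],
        longitude=row["geometry"].x,
        latitude=row["geometry"].y,
    )
```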
TODO:
subset_gpd.empty: continue
rwsddl_ssh_get_time_max()