Description
There is an issue in the estimate_place_fields function in spiking_likelihood_glm.py when using large arrays of spiking data (>2 GB). The function fails, and the call stack shows a ValueError: xxx exceeds max_bin_length(2147483647); the limit is 2**31 - 1 bytes, the largest binary blob msgpack will serialize. The error stems from the line results = dask.compute(*results) and is documented in the dask repository several times (for example here and here).
Current Behavior
When using large spike arrays (>2 GB), the function fails with the aforementioned ValueError.
Expected Behavior
The function should handle large spike arrays without raising an error.
Proposed Solution
A simple fix would be to call client.scatter on the spikes array before passing it into the tasks submitted to the dask cluster, so that the task graph carries a lightweight reference to the data rather than the raw array bytes. A sketch follows.
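A minimal sketch of what this could look like, assuming a dask.distributed Client; fit_one_neuron and the array shape are hypothetical stand-ins for the per-neuron GLM fit inside estimate_place_fields, not the library's actual code:

```python
import dask
import numpy as np
from dask.distributed import Client

client = Client()  # connect to the cluster that runs the GLM fits

@dask.delayed
def fit_one_neuron(spikes, neuron_id):
    # Hypothetical stand-in for fitting one neuron's place-field GLM.
    return spikes[:, neuron_id].mean()

spikes = np.random.rand(3_000_000, 100)  # float64, ~2.4 GB: over the 2**31 - 1 byte limit

# Scatter the array to the workers once. The delayed tasks then reference a
# small Future instead of embedding >2 GB of bytes in the serialized task
# graph, which is what trips msgpack's max_bin_length check.
spikes_future = client.scatter(spikes, broadcast=True)

results = [fit_one_neuron(spikes_future, i) for i in range(spikes.shape[1])]
results = dask.compute(*results)  # completes without the ValueError
```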
Steps to Reproduce
Call estimate_place_fields with a spike array larger than 2 GB and observe the ValueError being raised. A self-contained sketch of the underlying dask failure mode is given below.
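The failure can be reproduced independently of estimate_place_fields; the sketch below uses the same hypothetical fit_one_neuron stand-in as above, this time passing the raw array straight into the delayed tasks:

```python
import dask
import numpy as np
from dask.distributed import Client

client = Client()

@dask.delayed
def fit_one_neuron(spikes, neuron_id):
    # Hypothetical stand-in for fitting one neuron's place-field GLM.
    return spikes[:, neuron_id].mean()

spikes = np.random.rand(3_000_000, 100)  # float64, ~2.4 GB

# Embedding the raw >2 GB array in the delayed tasks forces dask to
# serialize it into the task graph, exceeding msgpack's 2**31 - 1 byte
# bin limit when the graph is shipped to the scheduler.
results = [fit_one_neuron(spikes, i) for i in range(spikes.shape[1])]
results = dask.compute(*results)  # ValueError: ... exceeds max_bin_length(2147483647)
```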