Speed up test_sorting_s3_nwb_zarr #2767
Merged
Speed up a very slow test by checking only a few units, cutting the total test run from 33 to 28 minutes!
A preview of the Great-Test-Speed-Up of 2024: SpikeInterface/SpikeInterface-Hackathon-Edinburgh-May24#8
(This seemed too easy and impactful to wait until the hackathon)
The function `test_sorting_s3_nwb_zarr` downloads a sorting object from S3 and checks some of its properties. This has traditionally taken a while because the object is fairly large, with ~450 units. Most of the testing time was spent looping over these 450 units and computing their spike trains. That is probably overkill, so I propose checking only a random selection of 3 of the 450 units. You can see the overall speed-up here, from my GitHub page (I realise these runs are failing, but they are attempting all the tests!):
(from https://github.com/chrishalcrow/spikeinterface/actions )
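The subsampling idea can be sketched as below. This is a minimal, hypothetical illustration (the helper name, seed, and unit-id layout are assumptions, not the PR's actual code): instead of looping over all ~450 units, draw a small random subset and only validate those.

```python
import numpy as np

def pick_units_to_check(unit_ids, num_to_check=3, seed=0):
    """Return a small random subset of unit ids to validate.

    Hypothetical helper: checking a few randomly chosen units still
    exercises the spike-train code path without iterating over all ~450.
    """
    rng = np.random.default_rng(seed)
    num_to_check = min(num_to_check, len(unit_ids))
    return rng.choice(unit_ids, size=num_to_check, replace=False)

# Stand-in for the ~450 unit ids of the downloaded sorting object.
unit_ids = np.arange(450)
selected = pick_units_to_check(unit_ids)
print(len(selected))  # 3 units checked instead of 450
```

The test then computes and validates spike trains only for `selected`, which is where the bulk of the runtime saving comes from.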