The std::collections BTreeMap has a range method that allows efficiently iterating a contiguous subset of entries between [min_key, max_key]. It would be awesome to have equivalent functionality in this crate, where we'd also get the benefits of structural sharing.
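For reference, this is the std behaviour I'd like to see on OrdMap (plain std code, nothing im-specific; an OrdMap equivalent might take a RangeBounds the same way, but that signature is just my guess):

```rust
use std::collections::BTreeMap;

fn main() {
    let mut map = BTreeMap::new();
    for k in 0..1_000 {
        map.insert(k, k * 10);
    }

    // Only the entries with keys in [10, 20] are visited; the rest of the
    // map is never touched.
    for (k, v) in map.range(10..=20) {
        println!("{k} -> {v}");
    }
}
```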
A related request is a remove_range method that removes all the keys between [min_key, max_key]. I'm not sure whether that could be implemented more efficiently in batch than by iterating a range and removing the keys one by one.
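For comparison, here's roughly what the one-at-a-time baseline looks like with the current API (just a sketch, and `remove_range_naive` is a made-up name; note it still scans the whole map to find the keys, which is part of the problem):

```rust
use im::OrdMap;

// Naive baseline a built-in remove_range would have to beat: find the keys
// in [min_key, max_key], then remove them one at a time. The filter below
// walks the whole map, which is exactly the cost we'd like to avoid.
fn remove_range_naive(map: &OrdMap<i32, String>, min_key: i32, max_key: i32) -> OrdMap<i32, String> {
    let doomed: Vec<i32> = map
        .iter()
        .filter(|(k, _)| **k >= min_key && **k <= max_key)
        .map(|(k, _)| *k)
        .collect();
    let mut result = map.clone();
    for k in doomed {
        result.remove(&k);
    }
    result
}
```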
In the meantime, the workaround I see is to call split(min_key) and iterate the right-hand side of the split until max_key. But split appears to iterate the entire map, which is prohibitively expensive if we're only interested in a small number of keys: https://docs.rs/im/11.0.1/src/im/ordmap.rs.html#977
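Concretely, that workaround looks something like this (a sketch; `range_via_split` is a name I made up, and I'm assuming from the docs that split discards the entry at the split key itself, hence the separate get for min_key):

```rust
use im::OrdMap;

// Workaround: split at min_key, then walk the right-hand map until max_key.
// The linked split implementation appears to traverse the whole map while
// rebuilding both halves, so this is O(n) even when the range is tiny.
fn range_via_split(map: &OrdMap<i32, String>, min_key: i32, max_key: i32) -> Vec<(i32, String)> {
    let (_smaller, larger) = map.split(&min_key);
    let mut out = Vec::new();
    // If I'm reading the docs right, split drops the entry at the split key,
    // so pick up min_key separately.
    if let Some(v) = map.get(&min_key) {
        out.push((min_key, v.clone()));
    }
    out.extend(
        larger
            .iter()
            .take_while(|(k, _)| **k <= max_key)
            .map(|(k, v)| (*k, v.clone())),
    );
    out
}
```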
breckognize changed the title from "Iterate over range of keys" to "Efficiently iterate over range of keys between [min_key, max_key]" on Jun 11, 2022