Is there an existing request for this?
Statement of Objective
Need a fast way to store millions of keyed/hashed items. Think Git repo objects.
Proposed Solution
Retrieving: Given a hash such as a [u8; 20], walk through the hash in standard iteration order, byte by byte. Each byte is the key into one level of the tree. If that level is an inner level, advance to the next byte and descend one level. If that level is a leaf level, scan its bucket of up to 256 entries for the matching value, then compare the full hash to verify it's the correct item.
Adding: Same walk as retrieving. If you hit a leaf whose bucket is already full, convert the leaf into an inner level and push its values down one level further. This splits the leaf bucket into 256 sub-buckets.
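The structure described above amounts to a 256-ary trie with leaf buckets that split on overflow. Here's a minimal in-memory sketch in Rust; the names, the bucket capacity of 256, and the heap-based representation are my own assumptions for illustration — the proposal presumably targets an on-disk format, as the SQLite inspiration suggests:

```rust
// Hypothetical sketch of the proposed structure: each inner node has 256
// children indexed by one hash byte; each leaf holds a bucket of
// (hash, value) pairs that splits into 256 sub-buckets when full.
const BUCKET_CAP: usize = 256; // assumed capacity; the issue doesn't fix one

enum Node<V> {
    Inner(Box<[Option<Node<V>>; 256]>),
    Leaf(Vec<([u8; 20], V)>),
}

impl<V> Node<V> {
    fn new_leaf() -> Self {
        Node::Leaf(Vec::new())
    }

    /// Retrieving: walk the hash byte by byte; at a leaf, scan the bucket
    /// and compare the full hash to verify the match.
    fn get(&self, hash: &[u8; 20], depth: usize) -> Option<&V> {
        match self {
            Node::Inner(children) => children[hash[depth] as usize]
                .as_ref()
                .and_then(|child| child.get(hash, depth + 1)),
            Node::Leaf(bucket) => {
                bucket.iter().find(|(h, _)| h == hash).map(|(_, v)| v)
            }
        }
    }

    /// Adding: same walk; when a leaf bucket is full, convert the leaf
    /// into an inner node and push its entries one level further down.
    fn insert(&mut self, hash: [u8; 20], value: V, depth: usize) {
        match self {
            Node::Inner(children) => {
                let slot = &mut children[hash[depth] as usize];
                slot.get_or_insert_with(Node::new_leaf)
                    .insert(hash, value, depth + 1);
            }
            Node::Leaf(bucket) => {
                if let Some(entry) =
                    bucket.iter_mut().find(|(h, _)| *h == hash)
                {
                    entry.1 = value; // same hash: overwrite in place
                    return;
                }
                if bucket.len() < BUCKET_CAP {
                    bucket.push((hash, value));
                    return;
                }
                // Split: re-insert every entry through a fresh inner node,
                // which distributes them by the next hash byte.
                let entries = std::mem::take(bucket);
                let mut inner =
                    Node::Inner(Box::new(std::array::from_fn(|_| None)));
                for (h, v) in entries {
                    inner.insert(h, v, depth);
                }
                inner.insert(hash, value, depth);
                *self = inner;
            }
        }
    }
}
```

Note that with distinct 20-byte hashes the split always terminates: two distinct hashes differ at some byte, so repeated splitting eventually separates them into different sub-buckets.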
Inspiration: the SQLite file format, but less complex.
Alternatives
No response
Additional context
No response