I'm using an iterator, so the complete dataset shouldn't be held in memory by badgerhold; however, badger itself may be caching data in memory, which is a common tactic among databases. Badger in general tends to use a lot more memory than BoltDB.
However, I believe I can make this count more efficient by using a key-only iterator. I'll need to do some research.
Yeah, a key-only iterator won't work, because the values are needed in order to filter by the query criteria.
I could disable pre-fetching values, but that would come at a performance cost. The root of the issue is that badger (like most databases) has decided to pay for more performance with memory usage.
Hi guys,

When using `Count` on a DB with a huge number of entries, memory is being eaten up, and it is freed right after `Count` returns.

Cheers,
artvel