
Adds LRU cache resize #45

Closed
wants to merge 1 commit into from

Conversation

kevburnsjr
Contributor

Adds a Resize method to LRU cache.


We use golang-lru in production at Crunchyroll for embedded HTTP response caching in microservices through microcache.

We'd like to wrap LRU cache with our own type which would allow for the cache size to be specified in total data size rather than element count. This would make the cache safer and more memory efficient since right now we have to make an educated guess at average element size (compressed HTTP responses) and reserve significant memory overhead for potential variation. I have a good idea of how to implement this wrapper by calculating average element size in real time and adjusting the cache size dynamically to strictly control memory usage. However, this requires the cache to be resizable so that it can be expanded or contracted depending on the average size of elements added and evicted.

This is an interface change adding one method.

This PR only affects LRU and not ARC or 2Q since those implementations would be more complex.

The performance of a cache resize depends on the delta. In general, downsizing is more expensive since it may result in many evictions while upsizing is relatively cheap.

@yonderblue

Sounds like what you want is an optional interface like Sizer() int that a key+value can implement, that is then used to determine total size.

@kevburnsjr
Contributor Author

@Gaillard Yes. I can implement that interface within my own package. I need this resize method in the base package in order to dynamically tune the cache size in response.

@yonderblue

But do you need to continually adjust the whole cache size if each element could affect the used amount by more than just a count, using an element Size()?

@kevburnsjr
Contributor Author

Okay, I don't see this feature getting any attention so I'll close the PR.
I created a different repo that adds the feature plus tagged cache invalidation.
https://github.com/KevBurnsJr/tci-lru

@kevburnsjr kevburnsjr closed this May 1, 2019
@kevburnsjr kevburnsjr deleted the resize branch May 1, 2019 22:47
@jefferai
Member

jefferai commented May 6, 2019

@kevburnsjr any chance you can reopen this so that it can be merged?

@kevburnsjr kevburnsjr restored the resize branch June 21, 2019 19:19
@kevburnsjr kevburnsjr mentioned this pull request Jun 21, 2019