
Combined caching strategy #212

Open — wants to merge 1 commit into master
Conversation

scjackson
Contributor

@jordangarcia @loganlinn

This includes the cache abstraction Logan authored plus the unwatch/config options I added. API changes include:

  1. A config option for cache. This expects a cache constructor that conforms to the interface specified in Logan's earlier work.
  2. Defaulting to an LRU cache with a maximum of 1000 items.
  3. A config option for maxItemsToCache, defaulting to 1000 if not specified. If it is a number > 0, the LRU cache will cache up to maxItemsToCache values; if it is non-numeric or null, we fall back to a BasicCache with no item limit.
  4. A config option for useCache that defaults to true. If false, no cache values will be set.
  5. A Getter pseudo-constructor that can be used to add cache and cacheKey options to getters. If specified, cacheKey is used as the getter's cache key rather than a getter reference. Valid values for cache are ['always', 'default', 'never']; 'always' and 'never' take precedence over the global useCache setting.
  6. When a getter is unobserved, its cached value is evicted. When getters are unobserved by handler, their caches are evicted if no other handlers are associated with the getter.
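The cache-selection rules in points 2 and 3 could be sketched roughly like this (illustrative only — BasicCache and LRUCache here are minimal Map-based stand-ins, not Nuclear's actual classes, and createDefaultCache is a hypothetical helper name):

```javascript
// Minimal stand-in for an unbounded cache with the #211-style interface.
class BasicCache {
  constructor() { this.store = new Map() }
  has(key) { return this.store.has(key) }
  lookup(key) { return this.store.get(key) }
  miss(key, value) { this.store.set(key, value) }
  evict(key) { this.store.delete(key) }
}

// Minimal LRU on top of it: Map iteration order tracks recency.
class LRUCache extends BasicCache {
  constructor(limit) { super(); this.limit = limit }
  lookup(key) {
    if (!this.store.has(key)) return undefined
    // Re-insert on access so this entry becomes most-recently-used.
    const value = this.store.get(key)
    this.store.delete(key)
    this.store.set(key, value)
    return value
  }
  miss(key, value) {
    if (this.store.size >= this.limit) {
      // Evict the least-recently-used entry (first key in insertion order).
      this.store.delete(this.store.keys().next().value)
    }
    this.store.set(key, value)
  }
}

// Selection logic per point 3: number > 0 means LRU with that cap,
// anything non-numeric (or null) means an unbounded BasicCache.
function createDefaultCache(maxItemsToCache = 1000) {
  return (typeof maxItemsToCache === 'number' && maxItemsToCache > 0)
    ? new LRUCache(maxItemsToCache)
    : new BasicCache()
}
```

So createDefaultCache() yields an LRU capped at 1000, while createDefaultCache(null) falls back to an unbounded BasicCache.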

@loganlinn
Contributor

A goal of the new caching abstraction (#211) was to leave Getter and Reactor as unaware of the caching features as possible. I believe there is a lot of value in keeping Nuclear's API footprint as small as possible, while exposing powerful APIs that can apply to many use cases. There's less code to maintain and fewer concepts for newcomers to learn.

I imagined these caching options (always/never, globally or at the Getter level) being implemented within a Cache instance rather than integrated directly. For example, you could create a class that holds a mapping of Getters to their custom cache settings, like

```javascript
var reactor = new Reactor({
  cache: new GranularCache({
    cache: "always",
    getters: Immutable.Map([
      [someGetter, "never"],  // using [k, v] tuple constructor because key is object
    ]),
  }),
})
```

Inside GranularCache, you would wrap a BasicCache (or even accept one as an argument if you want it to compose), then look up the Getter-specific settings in miss() to act accordingly.
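A sketch of that composition (GranularCache is hypothetical, a plain Map replaces Immutable.Map to stay dependency-free, and the has/lookup/miss/evict method names are assumed from the #211 cache interface):

```javascript
// Minimal inner cache exposing the assumed #211-style interface.
class BasicCache {
  constructor() { this.store = new Map() }
  has(key) { return this.store.has(key) }
  lookup(key) { return this.store.get(key) }
  miss(key, value) { this.store.set(key, value) }
  evict(key) { this.store.delete(key) }
}

// Hypothetical GranularCache: wraps an inner cache and consults
// per-Getter settings before storing anything on a miss.
class GranularCache {
  constructor(options) {
    this.cache = options.cache || 'default'         // global policy
    this.getters = options.getters || new Map()     // Getter -> 'always' | 'never'
    this.inner = options.inner || new BasicCache()  // composed cache
  }
  policyFor(getter) {
    // A Getter-specific setting wins over the global one.
    return this.getters.get(getter) || this.cache
  }
  has(key) { return this.inner.has(key) }
  lookup(key) { return this.inner.lookup(key) }
  miss(key, value) {
    if (this.policyFor(key) !== 'never') {
      this.inner.miss(key, value)
    }
  }
  evict(key) { this.inner.evict(key) }
}
```

With cache: 'always' and someGetter mapped to 'never', miss() becomes a no-op for someGetter while every other getter is still cached — so the Reactor never needs to know any of this exists.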

I think the main changes here worth keeping are the caching improvements around removing observers.

How does that sound?

@scjackson
Contributor Author

@loganlinn @jordangarcia up to you guys what you want to do with this. It'd be a shame not to get this into prod soon, as it resolves a good chunk of the memory leaks we see on the Optimizely account.
