TTL must be slightly longer than the time it takes for the promise to resolve. #16
Hey @garand, thanks for the feedback! Glad the library is helpful to you 👍 I'm afraid I don't fully understand what you're trying to achieve. Could you provide a list of calls with timestamps and the expected behavior vs the observed behavior? Heads up: I'm on vacation until January. I'll try to look at your answer within the next days, but please be patient if it takes a little longer |
Hi @garand. I'm back from vacation. Is this still relevant? |
Yes! I can put together a clearer example later today if that helps. |
That would be great, thank you! |
Here is a reproduction with some comments to hopefully better illustrate what I'm seeing. https://stackblitz.com/edit/express-simple-8bmtng?file=index.js Please let me know if you need any more details! |
Hey @garand, thanks for the detailed answer, now I understand! You should be able to observe that while the TTL is valid only one fresh value is requested. Given your example with 5s DELAY and 3s TTL:
Note how "Cache Resolved" is always pending, since once the promise resolves the TTL is already over. Maybe think of it like this: the promised fresh value is cached, not the resolved one. That said, is there a good reason why you would want the TTL and the DELAY to be additive? |
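The point above, that the promise (not the resolved value) is what gets cached, can be sketched in plain TypeScript. This is an illustrative simulation, not cachified's actual internals; all names here are made up:

```typescript
// Sketch: caching the *promise* means the ttl clock starts when the
// fetch begins, not when it resolves. With delayMs > ttlMs (like the
// 5s DELAY / 3s TTL example), every call after resolution refetches.
function makeCachedGetter(delayMs: number, ttlMs: number) {
  let entry: { value: Promise<string>; createdTime: number } | undefined;
  let fetchCount = 0;

  const getFreshValue = () =>
    new Promise<string>((resolve) => {
      fetchCount++;
      setTimeout(() => resolve(`fresh #${fetchCount}`), delayMs);
    });

  return {
    get(): Promise<string> {
      // The entry expires ttlMs after the promise was *created*...
      if (entry && Date.now() - entry.createdTime < ttlMs) {
        return entry.value;
      }
      // ...so when the fetch takes longer than the ttl, the entry is
      // already stale by the time it resolves, and every later call
      // starts a new fetch.
      entry = { value: getFreshValue(), createdTime: Date.now() };
      return entry.value;
    },
    calls: () => fetchCount,
  };
}
```

Calls made while the promise is still valid share the same in-flight promise; calls made after the ttl has elapsed mid-flight trigger a fresh fetch.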
closing this due to inactivity and since I consider this more a discussion then a bug in the library. Let me know how I can be of further assistance. |
hey @Xiphe, coming to this issue from a similar requirement. We compute all our ttl from the cache-control header of the response. In react-query, we also only start the staleTime once the response has come back.
The reason is mostly that you cannot control how long a request / function call takes. Maybe there are retries with exponential backoff involved, so a request can take up to 30s. If we then want the result to be cached for 30s, once we get the result back that time will already be over. I'm curious why you think this is the better default behaviour? |
Hi @TkDodo - I think most of this discussion has been covered in #94. In summary, I think the default behavior of cachified has slightly different design goals compared to what you're after. That said, the behavior you're looking for (from what I understand) can be achieved with the current APIs:
function getSomething(id: string) {
  return cachified({
    key: `something-${id}`,
    cache,
    async getFreshValue({ metadata }) {
      const response = await fetch(
        `https://jsonplaceholder.typicode.com/users/${id}`,
      );
      const data = await response.json();
      metadata.createdTime = Date.now(); // 👈 2️⃣ Start ttl from now, not from when the initial request was made
      metadata.ttl = getTTLFromCacheControlHeader(response); // 👈 3️⃣ set to -1 when no cache-control header is present
      return data;
    },
    ttl: Infinity, // 👈 1️⃣ allow `getFreshValue` to take forever
  });
} |
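`getTTLFromCacheControlHeader` is left undefined in the snippet above; one possible sketch could look like the following (the name, the max-age-only parsing, and the no-store handling are assumptions for illustration, not part of cachified):

```typescript
// Hypothetical helper: derive a ttl in milliseconds from a response's
// cache-control header, falling back to -1 ("don't cache", per the
// comment in the snippet above) when no usable max-age is present.
function getTTLFromCacheControlHeader(response: {
  headers: { get(name: string): string | null };
}): number {
  const cacheControl = response.headers.get("cache-control") ?? "";
  // Explicit "don't cache" directives map to -1 as well.
  if (/no-store|no-cache/.test(cacheControl)) {
    return -1;
  }
  const match = cacheControl.match(/max-age=(\d+)/);
  // max-age is in seconds; cachified ttl is in milliseconds.
  return match ? Number(match[1]) * 1000 : -1;
}
```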
thanks @Xiphe, that's an interesting approach. I didn't know we could alter the metadata from within getFreshValue. One problem I'm seeing is that those requests that we don't want to cache will also wind up in the cache. But I guess this is unavoidable, right? One more question to the above approach: if I set the ttl to -1 via the metadata, will the next call bypass the cache and get a fresh value? Thank you 🙏 |
It's not well documented. I guess 🤔
That should™ not be the case.
I'm not 100% sure here... I think yes. The ttl of a previously cached value should be over at this point. An interesting edge-case that we might not currently consider is: What if we're in a stale-while-revalidate refresh, and then set the ttl to -1? |
Oh that's great. I'll try it out and confirm today. |
@Xiphe you are absolutely right - the cache is not hit if we set the ttl to -1 🎉 |
Thanks for the additional conversation here, that's helpful for me! For my use cases, I'm wrapping the cachified function to add some defaults. Based on the above convo I updated it to this, which seems to do what I was initially talking about. Are there negative impacts of this that I'm not aware of?
return cachified({
  ttl: ms("1m"),
  ...options,
  cache: lru,
  key: keyPrefix + options.key,
  async getFreshValue(context) {
    const freshValue = await options.getFreshValue(context);
    /**
     * Set the created time for the cache entry to now;
     * this allows the ttl to start from the time
     * the value was returned from the `getFreshValue` function.
     */
    context.metadata.createdTime = Date.now();
    return freshValue;
  },
}); |
If anyone wants to contribute documentation on this point that is welcome! :) |
@garand nothing coming to mind. But it's strongly dependent on your use-case. |
In my use case I was trying to set `ttl: 0` and `staleWhileRevalidate: Infinity` so that every request after the initial request would always trigger a refetch in the background. However, in my tests, if the `ttl` value is less than the time the promise takes, it will ALWAYS refetch. This is my example that always refetches.
And if I change the promise timeout to `2000`, it will always return the cached value and refetch in the background (after the ttl). Because of this I can't set the `ttl` to 0 to get my desired outcome.
Let me know if you need any additional info. Thanks for this library, it's exactly what I've been looking for!