
Support caching for streamed responses #7

Open
predaytor opened this issue Sep 1, 2024 · 2 comments
Comments

@predaytor

predaytor commented Sep 1, 2024

Stackblitz

When using defer to stream a response from a loader that returns a promise later consumed in an Await boundary, we cannot properly serialize that promise to store it in, say, localStorage or IndexedDB (JSON.stringify yields an empty object {}). The only way to store this kind of data is directly in memory, so that on the next navigation the clientLoader returns the fulfilled promise with the data instead of calling the server loader.
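As a quick illustration (plain JavaScript semantics, nothing Remix-specific): a promise exposes no enumerable own properties, so it serializes to an empty object and the deferred data is silently lost.

```typescript
// A pending or fulfilled promise has no enumerable own properties,
// so JSON.stringify collapses it to {} and the data is lost.
const deferred = { query: Promise.resolve({ data: [{ id: 1 }] }) };
const serialized = JSON.stringify(deferred);

console.log(serialized); // → '{"query":{}}'
```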

let cache: SerializeFrom<typeof loader>;
export const clientLoader = defineClientLoader(async ({ serverLoader }) => {
    if (!cache) {
        cache = await serverLoader<typeof loader>();
    }

    return cache;
});

clientLoader.hydrate = true;

By using remix-client-cache, we can create an adapter for the cases where we need to cache streamed responses, rather than relying on a globally configured store such as localStorage.

import { lru } from 'tiny-lru';
import { type CacheAdapter, createCacheAdapter } from 'remix-client-cache';

const cache = lru(100);

class LRUAdapter implements CacheAdapter {
	async getItem(key: string) {
		return cache.get(key);
	}

	async setItem(key: string, value: any) {
		return cache.set(key, value);
	}

	async removeItem(key: string) {
		return cache.delete(key);
	}
}

export const { adapter: lruAdapter } = createCacheAdapter(() => new LRUAdapter());

routes/index.tsx:

import { defer } from '@remix-run/node';
import { Await, ClientLoaderFunctionArgs, Link } from '@remix-run/react';
import { Suspense } from 'react';
import { cacheClientLoader, useCachedLoaderData } from 'remix-client-cache';
import { lruAdapter } from '~/client-cache-adapter';

async function getQueryData() {
	await new Promise((resolve) => setTimeout(resolve, 3000));
	return { data: [{ id: 1 }] };
}

export async function loader() {
	const query = getQueryData();

	return defer({
		query,
	});
}

export const clientLoader = (args: ClientLoaderFunctionArgs) => cacheClientLoader(args, { adapter: lruAdapter });
clientLoader.hydrate = true;

export default function Page() {
	const { query } = useCachedLoaderData<typeof loader>();

	return (
		<>
			<Link to="/">Home</Link>

			<br />
			<br />

			<Suspense fallback="loading...">
				<Await resolve={query}>
					{({ data }) => (
						<ul>
							{data?.map((entry) => <li key={entry.id}>{entry.id}</li>)}
						</ul>
					)}
				</Await>
			</Suspense>
		</>
	);
}

This raises the question of whether it's a good idea to hold this data in server memory at all. It would be interesting if we could transform fulfilled promises with a library like turbo-stream (possibly in a web worker) so they can be stored on the client side, then decode them back to their original form for consumption.


There is currently a bug where a promise returned from the cache has already been fulfilled: the internal logic of remix-client-cache cannot tell whether a revalidation should occur or whether fulfilled data is already present, since we must store the promises directly in memory rather than as strings.
https://github.com/forge42dev/remix-client-cache/blob/main/src/index.tsx#L116-L140
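A hypothetical sketch (not remix-client-cache's actual API) of how the cache layer could at least distinguish an in-memory promise from plain serialized data, treating any thenable as stream-backed content whose freshness is unknown and therefore always revalidating it:

```typescript
// Hypothetical helper: any thenable is treated as deferred/streamed data
// whose freshness cannot be checked by comparing stringified snapshots.
function isThenable(value: unknown): value is PromiseLike<unknown> {
	return (
		typeof value === "object" &&
		value !== null &&
		typeof (value as { then?: unknown }).then === "function"
	);
}

console.log(isThenable(Promise.resolve(1))); // → true
console.log(isThenable({ data: [] })); // → false
```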

// Unpack deferred data from the server
useEffect(() => {
	let isMounted = true;
	if (loaderData.deferredServerData) {
		loaderData.deferredServerData.then((newData: any) => {
			if (isMounted) {
				adapter.setItem(loaderData.key, newData);
				setFreshData(newData);
			}
		});
	}
	return () => {
		isMounted = false;
	};
}, [loaderData, adapter]);

// Update the cache if the data changes
useEffect(() => {
	if (
		loaderData.serverData &&
		JSON.stringify(loaderData.serverData) !== JSON.stringify(freshData)
	) {
		setFreshData(loaderData.serverData);
	}
}, [loaderData?.serverData, freshData]);
@predaytor
Author

I was able to combine localforage and turbo-stream to encode and decode the data, but due to the nature of streaming, the promises can take a long time to resolve, and it would be wrong to wait for them to reach fulfilled status before saving to IndexedDB.

I believe we should not rely on client-side caching of streamed responses, as it is unreliable and unfit for purpose (especially for nested promises); instead, we should rely only on server-side and HTTP caching.
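For the HTTP-caching route, a minimal sketch using Remix's per-route headers export (the Cache-Control values here are illustrative, not a recommendation):

```typescript
// Illustrative Cache-Control values; tune max-age / stale-while-revalidate
// per route instead of caching deferred promises on the client.
export function headers() {
	return {
		"Cache-Control": "private, max-age=60, stale-while-revalidate=300",
	};
}
```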

import localforage from 'localforage';
import { decode, encode } from 'turbo-stream';

function promiseToReadableStream(promise: Promise<Uint8Array | null>): ReadableStream<Uint8Array> {
	return new ReadableStream<Uint8Array>({
		async start(controller) {
			try {
				const chunk = await promise;
				if (chunk !== null) {
					controller.enqueue(chunk);
				}
				controller.close();
			} catch (error) {
				controller.error(error);
			}
		},
	});
}

export class LocalForageAdapter {
	async getItem(key: string) {
		const encoded = await localforage.getItem<Uint8Array>(key);
		if (!encoded) return undefined;

		const stream = promiseToReadableStream(Promise.resolve(encoded));
		const decoded = await decode(stream);
		const data = decoded.value;

		await decoded.done;
		return data;
	}

	async setItem(key: string, value: unknown) {
		// Drain every chunk the turbo-stream encoder emits; reading only the
		// first chunk would silently truncate larger payloads.
		const reader = encode(value).getReader();
		const chunks: Uint8Array[] = [];
		for (;;) {
			const { value: chunk, done } = await reader.read();
			if (chunk) chunks.push(chunk);
			if (done) break;
		}

		const merged = new Uint8Array(chunks.reduce((n, c) => n + c.length, 0));
		let offset = 0;
		for (const chunk of chunks) {
			merged.set(chunk, offset);
			offset += chunk.length;
		}

		return localforage.setItem(key, merged);
	}

	async removeItem(key: string) {
		return localforage.removeItem(key);
	}
}
let cache = new LocalForageAdapter();
export async function clientLoader({ serverLoader, request }: ClientLoaderFunctionArgs) {
	const cacheKey = new URL(request.url).pathname;

	const cachedData = await cache.getItem(cacheKey);
	if (cachedData) return cachedData;

	// serverLoader returns instantly while the deferred promises are still
	// pending, so a pending (empty) promise is what gets saved to the cache
	const serverData = await serverLoader();
	await cache.setItem(cacheKey, serverData);

	return serverData;
}
clientLoader.hydrate = true;
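One hedged workaround sketch (a hypothetical helper, not part of any library): await every top-level promise before caching, so only settled data reaches IndexedDB. This blocks caching until the stream settles, which is exactly the trade-off described above.

```typescript
// Hypothetical: await each top-level field so only settled data is cached.
async function settleTopLevel<T extends Record<string, unknown>>(
	data: T,
): Promise<Record<string, unknown>> {
	const entries = await Promise.all(
		Object.entries(data).map(async ([key, value]) => [key, await value] as const),
	);
	return Object.fromEntries(entries);
}

const settled = await settleTopLevel({
	query: Promise.resolve({ data: [{ id: 1 }] }),
	flag: true,
});
console.log(settled); // the query field is now plain, serializable data
```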

@AlemTuzlak
Contributor

@predaytor I looked into this today. I'll try to figure out turbo-stream decoding/encoding and see if we can get it to work properly with nested promises etc. with great DX (zero setup on the user side). I'm definitely going to look into it, but I'm not sure when, and IF it will land, so no promises atm
