
Combine SSR and RSC responses + Request Provider #390

Merged: 15 commits from fd-vite-rsc-request-provider into fd-vite-rsc on Jan 12, 2022

Conversation

@frandiox (Contributor) commented Dec 21, 2021

Description

Attempt to implement #250

This PR is an exploration of how to implement Server-only context in a way that is supported by the official RSC, and how to combine the SSR and RSC responses into one (for performance). Feedback is very welcome.

This uses the official RSC, just like #317, and adds a few extra features:

  1. It adds a request.context object where we can store everything related to the current request.
  2. In SSR, it wraps the React tree in a Context/Provider (context is supported in SSR) to provide the current request object (and request.context) to the React tree.
  3. In RSC, it uses React.unstable_getCacheForType to create a cache that is scoped to the current RSC rendering (i.e. it doesn't conflict with other requests). In this cache, it stores the current request so it can be retrieved from anywhere in the React tree (just like a context provider, but one that works in RSC).
  4. It adds an internal useServerRequest hook that gets the current request anywhere in the tree (only callable from server components).

So far, this gives us a Server-only Context that stores data scoped to the current request, in both SSR and RSC, available anywhere in the React tree. Examples of things to store here: fetch promises (just like RenderCacheProvider in the main branch), a logger that includes the current request data, etc.
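To make the shape of this concrete, here is a minimal sketch of what such a provider/hook pair could look like, assuming an experimental React build that exposes React.unstable_getCacheForType. The names and structure are illustrative, not the exact code in this PR:

    // ServerRequestProvider.jsx — illustrative sketch only.
    import * as React from 'react';

    const RequestContextSSR = React.createContext(null);

    // Cache factory for RSC: React calls this lazily, once per flight render,
    // so each request ends up with its own Map.
    function requestCacheRSC() {
      return new Map();
    }

    export function ServerRequestProvider({request, isRSC, children}) {
      if (isRSC) {
        // Server components can't read React context, so stash the request
        // in the per-render cache instead.
        const cache = React.unstable_getCacheForType(requestCacheRSC);
        cache.set('request', request);
        return children;
      }

      // SSR supports context, so a plain provider works.
      return (
        <RequestContextSSR.Provider value={request}>
          {children}
        </RequestContextSSR.Provider>
      );
    }

    export function useServerRequest() {
      let request = null;

      try {
        // Available during SSR rendering.
        request = React.useContext(RequestContextSSR);
      } catch (error) {
        // Ignore: context is not available during RSC rendering.
      }

      if (!request) {
        // RSC rendering: read from the per-render cache.
        const cache = React.unstable_getCacheForType(requestCacheRSC);
        request = cache ? cache.get('request') : null;
      }

      if (!request) {
        throw new Error('No ServerRequest context found');
      }

      return request;
    }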

Finally:

  1. It runs SSR and RSC concurrently for the same request with a shared request.context.cache for fetch promises. This means it generates both responses at the same time but without repeating subrequests. When the SSR is done, it attaches the RSC response to the HTML and uses it in the browser instead of making a new RSC request.
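Roughly, the request handler could look like the sketch below; renderToFlightString and renderToHtmlString are hypothetical wrappers around the RSC and SSR renderers, not the PR's actual entry code:

    // Sketch: one request handler rendering SSR and RSC concurrently with a
    // shared per-request cache.
    async function handleRequest(request, response) {
      request.context = {cache: new Map()};

      // Start both renders at once; they share request.context.cache, so each
      // subrequest (fetch promise) is created only once.
      const flightPromise = renderToFlightString(request); // RSC
      const html = await renderToHtmlString(request); // SSR

      const flight = await flightPromise;

      // Attach the flight payload to the HTML so the browser hydrates from it
      // instead of issuing a second RSC request.
      // (Escaping of `</script>` inside the payload is omitted for brevity.)
      response.setHeader('content-type', 'text/html');
      response.end(
        html.replace(
          '</body>',
          `<script>window.__flight = ${JSON.stringify(flight)};</script></body>`
        )
      );
    }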

Things not implemented

  • This only works in Node right now to validate this approach, so no workers for now.
  • The RSC response is currently inlined at the end of the HTML. It should also be possible to stream chunks as they come by wrapping them in script tags (a sketch follows after this list):

        response.write(`<script>window.__flight.push(${JSON.stringify(rscChunk)})</script>`)

Every rscChunk is one line of the flight response so it can be consumed in the browser to prefetch client components asap.
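A sketch of what that streaming variant could look like (not implemented in this PR; createFlightScriptWriter is a hypothetical name):

    import {Writable} from 'stream';

    // Wraps every flight chunk in a script tag as it arrives, instead of
    // inlining the whole RSC payload after the HTML.
    // (Escaping of `</script>` inside chunks is omitted for brevity.)
    function createFlightScriptWriter(response) {
      // Initialize the buffer the client reads from.
      response.write('<script>window.__flight = window.__flight || [];</script>');

      return new Writable({
        write(chunk, _encoding, callback) {
          const rscChunk = chunk.toString();
          response.write(
            `<script>window.__flight.push(${JSON.stringify(rscChunk)})</script>`
          );
          callback();
        },
      });
    }

    // Usage: pipe the flight (RSC) stream into it while the HTML streams too,
    // e.g. rscRenderToPipeableStream(...).pipe(createFlightScriptWriter(response));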

Additional Context

Follow the instructions in #317 to try this locally.
I'm not 100% sure this all works well yet for bigger apps (it should?). There is a problem when 2 tabs are open (2 requests at the same time) related to the SSR Context/Provider (the context gets lost), but the RSC side works well. Not sure if I'm using Context/Provider correctly.

Comment on lines 99 to 100
"react-client": "link:../../../../react/build/node_modules/react-client",
"react-server": "link:../../../../react/build/node_modules/react-server",
"react-client": "file:../../../../react/build/node_modules/react-client",
"react-server": "file:../../../../react/build/node_modules/react-server",
@frandiox (Contributor Author):

This is actually important: with symlinks we end up using 2 different versions of React, which makes everything fail because the React dispatcher gets confused.

Comment on lines 117 to 118
    const {PassThrough} = await import('stream');
    const writer = new PassThrough();
@frandiox (Contributor Author):

For some reason, the HydrationWriter we use in the main branch doesn't get the whole RSC response here, only part of it. A PassThrough or a real ServerResponse gets all the RSC lines...
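For reference, a sketch of collecting the full flight output through a PassThrough inside the async request handler (variable names and the renderer call are illustrative):

    // Inside the async request handler:
    const {PassThrough} = await import('stream');

    const writer = new PassThrough();
    const chunks = [];
    writer.on('data', (chunk) => chunks.push(chunk));

    const flightPromise = new Promise((resolve) => {
      writer.on('end', () => resolve(Buffer.concat(chunks).toString()));
    });

    // rscRenderToPipeableStream (mentioned later in this thread) is the flight
    // renderer; piping it into the PassThrough captures every RSC line.
    rscRenderToPipeableStream(/* app element, client manifest */).pipe(writer);

    const flightResponse = await flightPromise;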

      return cached;
    }

    console.log('---FETCHING', key);
@frandiox (Contributor Author) commented Dec 21, 2021:

Notice that this is only logged once per request (per key) since both SSR and RSC use the same cache.
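The surrounding helper is presumably something like this sketch (fetchCached and its signature are assumptions), which is why the log fires only once even when both renders request the same key:

    // Sketch of a request-scoped fetch cache shared by the SSR and RSC renders.
    function fetchCached(request, key, doFetch) {
      const cache = request.context.cache;
      const cached = cache.get(key);

      if (cached) {
        // The second render (SSR or RSC) reuses the in-flight promise.
        return cached;
      }

      console.log('---FETCHING', key);

      const promise = doFetch();
      cache.set(key, promise);
      return promise;
    }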

@blittle (Contributor) left a comment:

I love this idea. unstable_getCacheForType seems like just what we need. We should reach out to the FB workgroup about using this for RSC.

@frandiox (Contributor Author) commented Jan 4, 2022

As mentioned in the OP, there's an issue with the SSR part where the context of ServerRequestProvider is lost after a while when 2 requests happen at the same time. To reproduce, open 2 tabs and restart the server, or directly run this in the browser console:

    function doRequest() {
      return fetch('http://localhost:3000', {headers: {accept: 'text/html'}});
    }
    await Promise.all([doRequest(), doRequest()]);

I've been debugging this but can't find why it happens. The same strategy currently works in the main branch. It keeps happening even after commenting out the whole RSC part.

@wizardlyhel Do you see any difference between your RenderCacheProvider and this ServerRequestProvider? 🤔

Comment on lines 291 to 293
    const requestCache = React.unstable_getCacheForType(
      requestHydrationCache
    );
@wizardlyhel (Collaborator):

I don't think this is where we should be using unstable_getCacheForType... maybe inside useServerRequest is where it should be.

Any cache instance created in the entry file is going to be a single instance for the life of this worker. Since the entry file is handling multiple requests, when there are 2 requests under the same key happening around the same time, the latter request will clear the cache of the previous one.

I think what you want is 1 cache instance per rendering request, which only lives for the duration of that rendering request, is that right?

@frandiox (Contributor Author):

As far as I understand, the cache used in unstable_getCacheForType gets instantiated once when rscRenderToPipeableStream is called.
Since each request calls this function separately once, I think we end up with different caches per request even within the same worker. Am I wrong? 🤔

Assuming this works, then we place the request object in this cache so it can be retrieved later in other parts of the tree.
The same request object is passed later to the SSR rendering as Context/Provider (this is where I find issues with multiple requests at the same time).

@wizardlyhel (Collaborator) commented Jan 5, 2022:

From the sounds of it, it doesn't look like unstable_getCacheForType is behaving the way we think it should, based on the issue you described.

My exploration of the RenderCache stuff starts with an empty {} that gets filled by the end of a render.

@frandiox (Contributor Author):

@wizardlyhel But the problem occurs in the SSR rendering, where unstable_getCacheForType is not used. In fact, I was testing and this same problem happens in the main branch with RenderCacheProvider.
The context gets reset to its initial value, but in RenderCacheProvider that's an empty cache, so it doesn't throw and probably just returns undefined values. If you initialize that context to null, you will start seeing No RenderCache Context found errors. Note that this only happens when rendering 2 requests at the same time:

    function doRequest() {
      return fetch('http://localhost:3000', {headers: {accept: 'text/html'}});
    }
    await Promise.all([doRequest(), doRequest()]);

So it looks like this is a bigger issue than I thought, and it's already in the main branch :(
Will report it properly tomorrow in a new issue.

@wizardlyhel (Collaborator) commented:
This is super cool, by the way. I just got it running on my local machine and I'm looking into that double-request issue.

@wizardlyhel (Collaborator) commented Jan 6, 2022

This is very interesting - the RSC flight injection is happening before the body streams.

I'm going to play around a bit and see how this behaves with a couple of Suspense boundaries.
Wondering if the RSC injection will delay FCP, or whether we can ensure the stream blocks get executed first with proper Suspense boundaries.

(Two screenshots attached.)

@wizardlyhel (Collaborator) commented Jan 6, 2022

This is awesome! The main HTML body contains the static HTML when more Suspense boundaries are placed. With this, we don't really need to worry about which one (RSC or stream blocks) gets injected first.

Here is the change I made in App.server.js:

    <Suspense>
      <div className="p-5">
        <div>hello!</div>
        <C1 />
        <C2 myProp2={true} />
        <Suspense>
          <CShared myPropShared={true} />
          <CAsync1 />
        </Suspense>
      </div>
    </Suspense>

(Screenshot attached.)

@frandiox (Contributor Author) commented Jan 7, 2022

@wizardlyhel Thanks a lot for testing it! Glad it works 😅

@frandiox merged commit 2548619 into fd-vite-rsc on Jan 12, 2022 and deleted the fd-vite-rsc-request-provider branch.