Making your website fast is the top priority for Gatsby. A big part of that is when you click on an internal link, the next page should load immediately. To make this happen, the resources for that page (at least the JS bundles) need to be cached in the browser. For browsers that support service workers, we accomplish this by precaching bundles in the service worker cache so when you navigate and request the next page, it can be quickly retrieved by the sw (typically < 25ms).
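To make the mechanism concrete, here's a minimal hand-rolled sketch of precaching plus cache-first serving. The cache name and bundle paths are made up, and this isn't the worker Gatsby actually generates for you, just the shape of the idea:

```js
// sw.js — sketch of precaching page bundles at install time and serving them
// cache-first. Cache name and file list are placeholders, not Gatsby's output.
const PRECACHE = 'precache-v1'
const BUNDLES = ['/commons.js', '/path---index.js', '/path---about.js']

self.addEventListener('install', event => {
  // Download and cache all listed bundles before the worker activates.
  event.waitUntil(
    caches.open(PRECACHE).then(cache => cache.addAll(BUNDLES))
  )
})

self.addEventListener('fetch', event => {
  // Serve from the cache when we can; fall back to the network otherwise.
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  )
})
```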
But what to do about old IEs and the new IE, Safari, which don't yet have Service Worker support? And, given iOS Safari's glacial upgrade pace, might not have it for years to come?
I've been pondering this for a while and hadn't come up with a pattern I was really happy with. But the recent Next.js launch blog post provided what I think is the right pattern:
> For www.zeit.co we've implemented a technique on top of Next.js that brings us the best of both worlds: every single `<Link>` tag pre-fetches the component's JSON representation on the background, via a ServiceWorker.
>
> If you navigate around, odds are that by the time you follow a link or trigger a route transition, the component has already been fetched.
If we create a custom Gatsby `<Link>` component, we can do the same thing. The Link component looks at what it's linking to and goes out and fetches that page's code and data resources. As you click around a site, you gradually download all the code bundles, ensuring site navigation stays lightning fast.
This might also end up being the pattern we use for browsers that *do* support service workers, as our current practice of caching whole sites would be problematic for very large sites. So caching as you go could be a better plan in general, while still keeping "cache everything" as an option, especially for sites that want to work offline immediately.
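To illustrate the difference, a cache-as-you-go worker populates its cache at fetch time instead of listing every bundle up front. A rough sketch (the cache name is a placeholder, and this is not something Gatsby currently ships):

```js
// sw.js — sketch of "cache as you go": cache JS bundles the first time they're
// requested rather than precaching the whole site at install.
const RUNTIME = 'runtime-v1'

self.addEventListener('fetch', event => {
  // Only bother caching JS bundles in this sketch.
  if (!event.request.url.endsWith('.js')) return

  event.respondWith(
    caches.open(RUNTIME).then(cache =>
      cache.match(event.request).then(cached => {
        if (cached) return cached
        // Not cached yet — fetch it, store a copy, and return the response.
        return fetch(event.request).then(response => {
          cache.put(event.request, response.clone())
          return response
        })
      })
    )
  )
})
```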
I've implemented an experimental version of this in a paid project. The `GatsbyLink` component looks like this:
```js
import React from 'react'
import Link from 'react-router/lib/Link'

class GatsbyLink extends React.Component {
  componentDidMount () {
    // Only enable prefetching of Link resources in production and for browsers that
    // don't support service workers *cough* Safari/IE *cough*.
    if (process.env.NODE_ENV === 'production' && !('serviceWorker' in navigator)) {
      const routes = window.gatsbyRootRoute
      const { createMemoryHistory } = require('history')
      const matchRoutes = require('react-router/lib/matchRoutes')
      const getComponents = require('react-router/lib/getComponents')
      const createLocation = createMemoryHistory().createLocation
      if (typeof routes !== 'undefined') {
        matchRoutes([routes], createLocation(this.props.to), (error, nextState) => {
          getComponents(nextState, () => console.log('loaded assets for ' + this.props.to))
        })
      }
    }
  }

  render () {
    return <Link {...this.props} />
  }
}

module.exports = GatsbyLink
```
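For context, you'd drop it in wherever you currently use react-router's `Link`. The import path below is just wherever you happen to save the component; resources for each target page start downloading as soon as the link mounts:

```js
// A site header using GatsbyLink in place of react-router's Link.
// The '../components/GatsbyLink' path is hypothetical.
import React from 'react'
import GatsbyLink from '../components/GatsbyLink'

const Header = () => (
  <header>
    <GatsbyLink to="/about/">About</GatsbyLink>
    <GatsbyLink to="/blog/">Blog</GatsbyLink>
  </header>
)

module.exports = Header
```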
You can try this out in the just-released alpha9.
You'll notice I'm pulling the routes off the window, which is a no-no of course. I did this because, for some still mysterious reason, if I required the routes module from `.intermediate-representation/split-child-routes`, Webpack would slip into an infinite loop while building. After poking at it for an hour or so I gave up.
The goal is to ship this as the default Link component, so this bug will need to be ironed out before then.
In the meantime, it's definitely proving a nice perf upgrade for non-service-worker browsers!