How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
-
I'm curious how Google evaluates page speed for SPAs. The initial payload is inherently large (often resulting in 5+ second load times), but subsequent requests are lightning fast, since they're handled by JS fetching data from the backend.
Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and the "slow"-ish load time) for each? Or does it load the initial JS+HTML once and then continue to crawl from there?
Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time?
Any insight (or speculation) would be much appreciated.
-
This is pure speculation, but my guess would be that each page is treated as its own URL, and that each page would indeed be seen as slower to load in full. In most cases, though, the viewport (what's actually in the user's view) would load quite quickly, so the page might effectively be judged "on average".
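To make that "on average" intuition concrete, here's a small illustrative sketch (the timings are hypothetical, not measured): if a crawler fetches every URL cold, each crawl pays the full initial-payload cost, whereas a real user's session amortizes that cost over subsequent fast client-side navigations.

```python
# Hypothetical timings in seconds -- illustrative assumptions, not measurements.
INITIAL_LOAD = 5.0  # full SPA payload on a cold hit
CLIENT_NAV = 0.3    # subsequent in-app navigation (JS + API fetch)

def crawler_view(num_pages: int) -> float:
    """Average load time if every URL is fetched cold (one 'refresh' per URL)."""
    return INITIAL_LOAD

def user_session_view(num_pages: int) -> float:
    """Average perceived load time for a user visiting num_pages in one session."""
    total = INITIAL_LOAD + (num_pages - 1) * CLIENT_NAV
    return total / num_pages

# A 10-page session: the cold-crawl view stays at 5s per URL,
# while the session view amortizes to well under 1s per page.
print(f"{crawler_view(10):.2f}")       # → 5.00
print(f"{user_session_view(10):.2f}")  # → 0.77
```

The gap between those two numbers is exactly what the question is about: whether Googlebot's picture of a URL looks more like `crawler_view` or `user_session_view`.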