9 min read

How Google Crawls a JavaScript Site: The Rendering Pipeline


JavaScript frameworks power most modern sites, but Google does not crawl and render JavaScript the same way a browser does. Understanding the rendering pipeline is essential for any site using React, Vue, Next.js, Nuxt, Angular, or similar.

How Googlebot fetches a page

Googlebot fetches pages in two passes:

  1. First pass, HTML-only crawl. Googlebot fetches the HTML response and stores it immediately. Content in raw HTML is indexed right away. JS files are queued but not yet executed.
  2. Second pass, rendering. A headless Chromium instance (the "Web Rendering Service" or WRS) executes JS, evaluates the DOM, and indexes the rendered output. This can be delayed by hours to days depending on crawl budget and rendering queue depth.

Critical implication: content that only appears after JS executes may take days to be indexed, or may never be indexed at all if the site's crawl budget runs out before the page reaches the render queue.

SSR vs CSR vs SSG: what Google actually sees

Strategy | First-pass HTML | SEO risk
CSR (client-side only) | Empty shell: <div id="root"></div> | High: all content delayed to the render pass
SSR (server-side render) | Full HTML with content | Low: content in first pass
SSG (static generation) | Full HTML pre-built at deploy | Lowest: fastest crawl + index
ISR (incremental static regen) | Full HTML (stale-while-revalidate) | Low: depends on revalidation interval
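The table's CSR/SSR contrast can be made concrete. A minimal sketch of what the server body contains under each strategy; the product data, template, and function names are invented for illustration:

```javascript
// Invented sample data for the sketch.
const product = { name: 'Trail Runner X', price: 129 };

// CSR: the server ships an empty shell; content exists only after the bundle runs.
function renderCSR() {
  return '<div id="root"></div><script src="/app.js"></script>';
}

// SSR/SSG: the server (or the build step) ships the full document.
function renderSSR(p) {
  return `<div id="root"><h1>${p.name}</h1><p>$${p.price}</p></div>`;
}

console.log(renderCSR().includes('Trail Runner X'));        // false — delayed to the render pass
console.log(renderSSR(product).includes('Trail Runner X')); // true — indexable on the first pass
```

The first-pass crawl sees exactly these strings, which is why the CSR row carries the highest risk.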

Common JS SEO mistakes

1. Critical metadata only in JS

Title tags, meta descriptions, canonical tags, and OG tags set only via document.title or a client-side head manager may not appear in first-pass HTML. Always render these in the server response.

<!-- BAD: set by JS only -->
<title>Loading...</title>

<!-- GOOD: SSR sets correct title in HTTP response -->
<title>Best Running Shoes 2026 | ShoeStore</title>
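A minimal sketch of the server side of the GOOD case, assuming a plain Node handler; the injectMeta helper, placeholder token, and field names are invented. The point is that title, description, canonical, and OG tags are present in the HTTP response body itself:

```javascript
// Inject page metadata into the server response so it appears in first-pass HTML.
// The <!--HEAD--> placeholder and meta fields are illustrative, not a real API.
function injectMeta(template, meta) {
  const head = [
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
    `<link rel="canonical" href="${meta.canonical}">`,
    `<meta property="og:title" content="${meta.title}">`,
  ].join('\n');
  return template.replace('<!--HEAD-->', head);
}

const html = injectMeta('<head><!--HEAD--></head><body></body>', {
  title: 'Best Running Shoes 2026 | ShoeStore',
  description: 'Top picks tested on road and trail.',
  canonical: 'https://shoestore.example/best-running-shoes',
});

console.log(html.includes('<title>Best Running Shoes 2026 | ShoeStore</title>')); // true
```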

2. Lazy-loaded content not deferred properly

Content loaded via IntersectionObserver or user interaction won't be triggered by Googlebot. Important content must be in initial render; only supplementary elements (infinite scroll, tabs) should be lazy.
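One way to keep that split honest is to classify sections up front, sketched here with an invented supplementary flag; the browser guard matters because Googlebot's renderer does not scroll or click, and this sketch should also run outside a browser:

```javascript
// Render primary content eagerly; defer only supplementary sections.
function partitionSections(sections) {
  return {
    eager: sections.filter((s) => !s.supplementary),
    lazy: sections.filter((s) => s.supplementary),
  };
}

function loadSection(id) { /* fetch and insert the section's markup (stub) */ }

const { eager, lazy } = partitionSections([
  { id: 'product-description' },                 // must be in the initial HTML
  { id: 'reviews-page-2', supplementary: true }, // safe to lazy-load
]);

// Only the lazy bucket goes through IntersectionObserver (browser-only, so guarded):
if (typeof IntersectionObserver !== 'undefined') {
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((e) => e.isIntersecting && loadSection(e.target.id));
  });
  // lazy.forEach((s) => observer.observe(document.getElementById(s.id)));
}
```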

3. Blocking JS in <head>

Large synchronous scripts in <head> delay Time to First Byte and First Contentful Paint, both signals the WRS observes. Use defer or async on non-critical scripts, and split your bundle.
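A sketch of the three loading modes (the script names are invented):

```html
<!-- BAD: blocks HTML parsing until downloaded and executed -->
<script src="/analytics.js"></script>

<!-- defer: downloads in parallel, executes after parsing, in document order -->
<script defer src="/app.js"></script>

<!-- async: downloads in parallel, executes as soon as it arrives (order not guaranteed) -->
<script async src="/ads.js"></script>
```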

4. Soft 404s from client-side routing

SPAs often return HTTP 200 for "not found" routes and handle 404 in JS. Googlebot sees HTTP 200 and indexes the empty error page. Fix: return proper HTTP 404 status codes from your server or CDN for invalid routes.
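A hedged sketch of the server-side fix, assuming you can enumerate valid routes (the route table and function name are invented); the same check can live in a CDN edge function:

```javascript
// Return a real 404 status for unknown routes instead of 200 + a JS error page.
const knownRoutes = new Set(['/', '/products', '/about']);

function statusForRoute(path) {
  return knownRoutes.has(path) ? 200 : 404;
}

// e.g. in a Node handler: res.statusCode = statusForRoute(req.url);
console.log(statusForRoute('/products'));     // 200
console.log(statusForRoute('/no-such-page')); // 404
```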

5. Internal links in JS event handlers

Links that only exist as onclick handlers or JS navigation calls are not followed by Googlebot during the HTML crawl pass. Use real <a href> tags for all SEO-critical navigation.
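The contrast in markup (paths invented):

```html
<!-- BAD: invisible to the HTML crawl pass -->
<span onclick="router.push('/category/shoes')">Shoes</span>

<!-- GOOD: a real link Googlebot can follow; the SPA router can still intercept the click -->
<a href="/category/shoes">Shoes</a>
```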

Testing what Google sees

  • Google Search Console โ†’ URL Inspection โ†’ View Crawled Page - shows the rendered HTML Google actually indexed.
  • curl -A googlebot https://yoursite.com/page: rough proxy for first-pass HTML (no JS execution).
  • Puppeteer / Playwright: run headless Chromium and compare DOM output to raw HTML.
  • AuditAI: checks OG tags, title, canonical, hreflang, and schema presence in the raw HTTP response, flagging JS-only metadata.

Framework-specific tips

  • Next.js: prefer getServerSideProps or getStaticProps; avoid pure useEffect data fetch for SEO-critical content. Use <Head> (next/head), SSR-safe.
  • Nuxt 3: useHead() is SSR-safe. Enable ssr: true (default) or use nuxt generate for SSG.
  • Angular Universal: SSR is opt-in. Verify Transfer State to avoid double API calls after hydration.
  • Gatsby: fully SSG by default, all pages pre-built, best case for crawlability.

Crawl budget and JS rendering

Large JS-heavy sites consume crawl budget twice: once for the HTML fetch, once for rendering. Pages with very low crawl priority may never reach the render queue. Strategies:

  • noindex pages you don't need indexed (param variants, admin).
  • Consolidate thin paginated pages via canonical to the root category.
  • Keep your sitemap tight, only submit URLs that should be indexed.
  • Speed up Time-to-First-Byte; Googlebot times out rendering after ~5 seconds.

Related: Technical SEO Checklist · Core Web Vitals Guide · Page Speed 80/20 Wins

Check how Google sees your site with AuditAI →
