LeadsuiteNow

JavaScript SEO: How to Make JS-Heavy Sites Crawlable

December 1, 2026 · 11 min read
Technical SEO · JavaScript SEO · React · Next.js · Crawlability

JavaScript has fundamentally changed how websites are built — and it has created one of the most complex challenges in modern technical SEO. When a site relies on JavaScript to render its content, product listings, navigation, or metadata, there is a real risk that search engines cannot access any of it. Google can execute JavaScript, but it does so in a two-wave process that introduces delays, has resource limits, and can fail silently. Meanwhile, Bing, DuckDuckGo, and most other search engines have significantly weaker JavaScript rendering capabilities. This guide covers how JavaScript affects crawling and indexing, the critical differences between rendering strategies (CSR, SSR, SSG, ISR), how to diagnose JavaScript SEO problems, and the specific fixes for React, Next.js, Angular, and Vue applications.

How Google Crawls and Renders JavaScript: The Two-Wave Process

Google's JavaScript processing happens in two waves, and understanding this is fundamental to diagnosing JS SEO problems. In wave one, Googlebot visits a URL and downloads the raw HTML. If the raw HTML contains all the necessary content and metadata, Google can index the page immediately — this is how traditional server-rendered sites work. For JavaScript-rendered sites, the raw HTML is often an empty shell: a div with id='app' and a script tag. In wave one, Googlebot records this essentially empty page. In wave two, which can occur anywhere from hours to weeks later, Google's Web Rendering Service (WRS) executes the JavaScript, renders the page, and processes the now-visible content. This delay is significant: content that exists in your app but only appears after JavaScript execution may not be indexed for days or weeks. Since 2019 the WRS has been evergreen, running a recent stable version of Chromium, so missing browser features are rarely the cause of rendering failures; far more often, rendering fails because resources are blocked by robots.txt, requests time out, or scripts throw errors. Google also has a finite rendering budget — extremely JavaScript-heavy pages may have their rendering terminated before completion, resulting in partially indexed content.

  • Wave 1: HTML-only crawl — if content is in raw HTML, indexing is immediate
  • Wave 2: JavaScript rendering — can be delayed hours to weeks after initial crawl
  • Google's WRS is evergreen (a recent stable Chromium) — rendering still fails silently on blocked resources, timeouts, or script errors
  • Rendering budget limits: extremely heavy JS pages may be partially rendered
  • Bing, Apple, and other engines have weaker or no JS rendering — CSR sites may miss these entirely
  • Content visible only after JS execution is a Wave 2 dependency — this is the core SEO risk
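The wave-one risk above can be sanity-checked in code: strip tags from the raw server response and see whether a phrase you need indexed survives. A minimal Node.js sketch (the sample HTML shells and helper names are ours, not from any particular tool):

```javascript
// Sketch: check whether critical content is visible in wave one,
// i.e. present in the raw HTML before any JavaScript executes.

// Strip tags and collapse whitespace to approximate the text a
// wave-one crawl can see without running scripts.
function visibleTextFromRawHtml(rawHtml) {
  return rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, ' ') // script bodies are not content
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')                    // drop remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}

// True when the phrase would be indexable in wave one.
function isWaveOneVisible(rawHtml, phrase) {
  return visibleTextFromRawHtml(rawHtml).includes(phrase);
}

// A CSR shell: the phrase only appears after JS runs, so wave one misses it.
const csrShell = `<html><body><div id="app"></div>
  <script>document.getElementById('app').textContent = 'Blue widgets on sale';</script>
</body></html>`;

// A server-rendered page: the phrase is already in the raw HTML.
const ssrPage = `<html><body><h1>Blue widgets on sale</h1></body></html>`;

console.log(isWaveOneVisible(csrShell, 'Blue widgets on sale')); // false
console.log(isWaveOneVisible(ssrPage, 'Blue widgets on sale'));  // true
```

Running the same check against your real pages (fetch the URL, pass the body through `isWaveOneVisible`) quickly flags templates that are wave-two dependent.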

Client-Side Rendering vs Server-Side Rendering vs Static Site Generation

The rendering strategy your site uses determines whether Googlebot sees your content in wave one or must wait for wave two — or may never see it at all. Client-Side Rendering (CSR) means all content is rendered in the browser via JavaScript. The server delivers an HTML shell and a JavaScript bundle; the browser executes JS to build the page. From Googlebot's perspective, the raw HTML is empty. This is the highest-risk approach for SEO. Frameworks using CSR by default include Create React App, Vue CLI (default), and Angular without universal rendering. Server-Side Rendering (SSR) means the server executes JavaScript and delivers fully rendered HTML to the client and to Googlebot. Content is visible in wave one. SSR frameworks include Next.js (getServerSideProps), Nuxt.js (SSR mode), and Angular Universal. Static Site Generation (SSG) pre-renders all pages to HTML at build time. Content is fully available in wave one. Fastest for SEO. Next.js (getStaticProps), Gatsby, Astro, and Hugo use SSG. Incremental Static Regeneration (ISR) is Next.js-specific — pages are statically generated but can be regenerated in the background after a set interval. ISR combines the SEO benefits of SSG with the ability to serve fresh content.

  • CSR: highest SEO risk — all content is Wave 2 dependent, avoid for content-heavy sites
  • SSR: content in Wave 1 HTML — best for dynamic content that changes per request
  • SSG: fastest indexing, all content pre-built — best for content-heavy sites with infrequent updates
  • ISR (Next.js): SSG with background regeneration — best balance for large content sites
  • Hybrid rendering: Next.js and Nuxt allow per-page rendering strategy — use SSG for SEO pages, SSR or CSR for app-like pages
  • Never use CSR for pages you need indexed: product pages, blog posts, landing pages, category pages
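As a sketch of the per-page choice above, here is what SSG with ISR looks like in a Next.js Pages Router page. The slug, the one-hour revalidate value, and the in-memory POSTS object (standing in for a CMS) are illustrative assumptions; in a real pages/blog/[slug].js file both functions would be exported:

```javascript
// Sketch of a Next.js Pages Router page using SSG + ISR. The POSTS
// object stands in for a CMS, and the functions are plain
// declarations (rather than `export`ed) so the sketch runs
// standalone in Node.

const POSTS = {
  'javascript-seo': { title: 'JavaScript SEO', body: 'How to make JS sites crawlable.' },
};

function getPostBySlug(slug) {
  return POSTS[slug] ?? null;
}

// getStaticProps runs at build time (SSG); `revalidate` turns it
// into ISR, regenerating the page in the background at most once
// per hour after a request arrives.
async function getStaticProps({ params }) {
  const post = getPostBySlug(params.slug);
  if (!post) return { notFound: true };          // 404 for unknown slugs
  return { props: { post }, revalidate: 3600 };  // seconds between regenerations
}

// getStaticPaths tells Next.js which slugs to pre-render at build time.
async function getStaticPaths() {
  return {
    paths: Object.keys(POSTS).map((slug) => ({ params: { slug } })),
    fallback: 'blocking', // render unknown slugs on demand, server-side
  };
}
```

The same page could opt into SSR instead by defining getServerSideProps; content then renders per request but still arrives as wave-one HTML.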

How to Diagnose JavaScript SEO Problems on Your Site

Diagnosing JS SEO issues requires comparing what Googlebot sees versus what a user sees in a browser. The most direct method is Google Search Console's URL Inspection tool — click 'Test Live URL' and then 'View Tested Page' to see the HTML as Google rendered it, alongside a screenshot of the rendered page. If the screenshot shows your full page but the HTML tab is missing key content, Google is seeing content in wave two but the raw HTML lacks it — this creates indexing delays. If both the screenshot and HTML are missing content, rendering is failing entirely. Screaming Frog can render pages in two modes: without JavaScript (like Googlebot wave one) and with JavaScript (like a browser). Compare the two crawl reports — any content in the JS crawl but not in the non-JS crawl is wave-two dependent. Chrome DevTools is also essential: disable JavaScript (Settings > Debugger > Disable JavaScript) and reload your page. What you see is roughly what Googlebot sees in wave one. Missing navigation, products, or articles when JS is disabled indicates CSR architecture that needs addressing.

  1. Use GSC URL Inspection — compare HTML source vs rendered screenshot for key pages
  2. Run Screaming Frog in both JS-disabled and JS-enabled modes, compare content extraction
  3. Disable JavaScript in Chrome DevTools and reload page — missing content is Wave 2 dependent
  4. Check response size in logs — page returning 3KB when browser shows 45KB indicates empty HTML shell
  5. Use Google's Rich Results Test to verify structured data (which must be in raw HTML) is visible
  6. Check Lighthouse accessibility tree with JS disabled to understand what Google sees
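The Screaming Frog comparison in this section can be approximated with a small script: diff the visible text of the non-JS crawl against the JS crawl, and whatever appears only in the latter is wave-two dependent. The sample crawl texts below are invented for illustration:

```javascript
// Sketch: given the visible text of a page crawled without JS and
// with JS (e.g. exported from two Screaming Frog crawls), report
// which words are wave-two dependent — present only after
// JavaScript executes.

function waveTwoDependentPhrases(noJsText, jsText) {
  const tokenize = (t) =>
    new Set(t.toLowerCase().split(/[^a-z0-9']+/).filter(Boolean));
  const noJs = tokenize(noJsText);
  // Words that appear only in the JS-rendered crawl are at risk.
  return [...tokenize(jsText)].filter((word) => !noJs.has(word));
}

// Invented sample crawls: navigation is server-rendered, but the
// product listing only exists after JavaScript runs.
const noJsCrawl = 'Acme Store Home About Contact';
const jsCrawl = 'Acme Store Home About Contact Blue Widget $49 Add to cart';

console.log(waveTwoDependentPhrases(noJsCrawl, jsCrawl));
// → [ 'blue', 'widget', '49', 'add', 'to', 'cart' ]
```

Anything this diff surfaces on a real site — product names, prices, article bodies — is exactly the content the section above warns may take weeks to index.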

Fixing JavaScript SEO in React and Next.js Applications

Next.js is the most SEO-friendly React framework because it supports SSR, SSG, and ISR out of the box with per-page granularity. For a Next.js app with CSR pages that need indexing, the fix is typically moving data fetching out of client-side hooks (or the legacy getInitialProps) into getStaticProps (for SSG) or getServerSideProps (for SSR). For pages fetching data from an API, getStaticProps with revalidate enables ISR. For React applications not using Next.js — built with Create React App or Vite — the options are: migrate to Next.js, implement a prerendering service, or use dynamic rendering. Dynamic rendering detects a crawler's user agent and serves a server-rendered version of the page to bots while serving the CSR version to users. Tools like Prerender.io and Rendertron provide dynamic rendering as a service. This is a pragmatic workaround but requires correct implementation to avoid cloaking violations — the server-rendered content must match the CSR content exactly. Avoid dynamic rendering as a permanent solution; treat it as a bridge while migrating to SSR or SSG.

  • Next.js: use getStaticProps for SSG on content pages, getServerSideProps for dynamic data pages
  • Add revalidate to getStaticProps for ISR — set revalidate value based on content update frequency
  • Ensure all meta tags (title, description, og:image) are in server-rendered HTML, not client-rendered
  • Check _app.js and _document.js for any client-only head tag implementations
  • Structured data (JSON-LD) must be in the initial HTML response, not injected client-side
  • Use next/head for all SEO-critical head content — it renders server-side by default in Next.js
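Dynamic rendering as described above hinges on one decision: does this user agent belong to a crawler? A minimal sketch of that check (the bot pattern is deliberately incomplete; production setups rely on a maintained list such as Prerender.io's):

```javascript
// Sketch of the user-agent check behind dynamic rendering: requests
// from known crawlers get the prerendered HTML snapshot, everyone
// else gets the CSR bundle. The pattern below is illustrative and
// incomplete — use a maintained bot list in production.

const BOT_PATTERN =
  /googlebot|bingbot|duckduckbot|baiduspider|yandex|twitterbot|facebookexternalhit/i;

function wantsPrerenderedHtml(userAgent) {
  return BOT_PATTERN.test(userAgent ?? '');
}

// In an Express-style middleware this would branch the response:
function chooseResponse(userAgent) {
  return wantsPrerenderedHtml(userAgent)
    ? 'serve prerendered HTML snapshot'
    : 'serve CSR app shell';
}

console.log(chooseResponse('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'));
// → serve prerendered HTML snapshot
console.log(chooseResponse('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0'));
// → serve CSR app shell
```

As the section stresses, the snapshot served to bots must be identical in content to what users see, or this becomes cloaking.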

JavaScript SEO in Angular and Vue.js Applications

Angular applications are CSR by default, making them SEO-unfriendly out of the box. Angular Universal is Angular's official SSR solution — it renders Angular applications on a Node.js server, delivering full HTML to Googlebot in wave one. Implementing Angular Universal requires setting up a server-side app module, an Express server to handle requests, and ensuring that browser-only APIs (window, document, localStorage) are not called during server-side rendering — these do not exist in a Node.js environment and will cause Universal rendering to fail. Vue.js applications face similar challenges. Nuxt.js is the Vue equivalent of Next.js and provides SSR and SSG modes. A Vue CLI application without Nuxt is CSR by default. The migration path from a Vue CLI app to Nuxt.js is more involved than switching rendering modes but is the most reliable long-term fix. For both Angular and Vue, a common intermediate fix is using a prerendering tool like Prerender.io at the reverse proxy level (Nginx or a CDN like Cloudflare) to serve static HTML snapshots to Googlebot while the app continues to run in CSR mode for users.

  • Angular Universal: official SSR for Angular — requires Node.js server and browser API guards
  • Nuxt.js: use SSR mode for dynamic content, SSG mode for static content — configure per page
  • Vue CLI (default CSR): migrate to Nuxt.js or implement Prerender.io at reverse proxy level
  • Guard browser-only APIs in SSR: wrap window, document, localStorage calls in isPlatformBrowser checks
  • Test SSR rendering by disabling JavaScript in Chrome after SSR implementation
  • Monitor TTFB (Time to First Byte) after SSR implementation — SSR adds server-side processing time
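The browser-API guard in the bullets above is framework-agnostic; here is a minimal sketch in plain JavaScript. Angular Universal's isPlatformBrowser check is the idiomatic version of the same idea, and the 'theme' key is an invented example:

```javascript
// Sketch of the browser-API guard the bullets describe: never touch
// window/document/localStorage during SSR, where they don't exist.
// Angular's isPlatformBrowser check plays the same role.

const isBrowser = typeof window !== 'undefined';

// Read a stored preference, falling back safely on the server.
function getStoredTheme(fallback = 'light') {
  if (!isBrowser) return fallback;        // SSR: no localStorage
  try {
    return window.localStorage.getItem('theme') ?? fallback;
  } catch {
    return fallback;                       // storage blocked (privacy mode)
  }
}

console.log(getStoredTheme()); // on a Node/SSR run: 'light'
```

Without the guard, the first `window.localStorage` access would throw a ReferenceError in Node and abort server-side rendering — exactly the Universal failure mode described above.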

Metadata, Structured Data, and Canonical Tags in JavaScript Frameworks

Title tags, meta descriptions, canonical tags, Open Graph tags, and structured data all have the same requirement: they must be present in the server-rendered HTML, not added client-side after JavaScript execution. This is a common mistake in JavaScript applications. A React component that sets document.title on mount is CSR — Google's wave one crawl will miss the title. Libraries like React Helmet, Vue Meta, and Angular Meta Service can manage head tags, but whether they render server-side depends on your SSR setup. In Next.js, use the next/head component or the metadata API (App Router) — these are guaranteed to render server-side. For structured data (JSON-LD), the schema markup must be in the initial HTML response. Injecting JSON-LD via a useEffect hook means it only runs in the browser — Googlebot's wave one crawl will not see it, and wave two rendering may still miss it if the rendering budget is exceeded. Always add JSON-LD directly in your SSR page components or through a metadata configuration that runs server-side.

  • Title and meta description must be in initial HTML — not set via document.title in useEffect
  • Next.js metadata API (App Router) or next/head: both render server-side correctly
  • JSON-LD structured data must be server-rendered — never in useEffect or componentDidMount
  • Canonical tags injected client-side may be seen in Wave 2 but Wave 1 will be canonical-less
  • Open Graph tags must be in server HTML for social sharing to work (social bots don't render JS)
  • Verify all meta tags are present in raw HTML source (Ctrl+U in browser, not DevTools Elements)
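One way to guarantee JSON-LD ships in the initial HTML is to build the string server-side and embed it in the page output. A sketch using schema.org's Article type (the helper name and field choices are ours):

```javascript
// Sketch: build a schema.org Article JSON-LD string server-side so
// it ships in the initial HTML response. In Next.js the returned
// string would be embedded in the page component's output inside a
// <script type="application/ld+json"> tag.

function buildArticleJsonLd({ headline, datePublished, authorName }) {
  const data = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline,
    datePublished,
    author: { '@type': 'Person', name: authorName },
  };
  // Note: JSON.stringify does not escape '<', so if any field can
  // contain untrusted input, replace '<' with '\u003c' before
  // embedding to prevent a </script> breakout.
  return JSON.stringify(data);
}

const jsonLd = buildArticleJsonLd({
  headline: 'JavaScript SEO: How to Make JS-Heavy Sites Crawlable',
  datePublished: '2026-12-01',
  authorName: 'LeadsuiteNow',
});

console.log(JSON.parse(jsonLd)['@type']); // → Article
```

Because the string is computed during server rendering, it appears in the raw source (Ctrl+U), which is exactly the verification step the bullets call for.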

Core Web Vitals and JavaScript Performance for SEO

JavaScript performance directly affects Core Web Vitals, which are a confirmed Google ranking factor. The three metrics are Largest Contentful Paint (LCP), Interaction to Next Paint (INP, replacing FID since March 2024), and Cumulative Layout Shift (CLS). LCP is most commonly impacted by JavaScript: if the largest content element on the page (hero image, headline) is rendered by JavaScript, LCP is delayed until the JS executes. Pages with LCP above 4 seconds are classified as 'Poor' by Google. INP measures responsiveness to user interactions — heavy JavaScript execution on the main thread during interactions causes poor INP scores. CLS is affected by JavaScript that injects content after initial page load, pushing existing content down. Key JavaScript performance optimisations for Core Web Vitals: code splitting (only load JS needed for current page, defer the rest), eliminating render-blocking scripts, preloading critical resources, using web workers to offload heavy computation off the main thread, and avoiding large layout shifts from dynamically injected components. The PageSpeed Insights tool and Chrome DevTools Performance panel are the primary diagnostic tools.

  • LCP threshold: under 2.5 seconds (Good), 2.5-4s (Needs Improvement), over 4s (Poor)
  • Code splitting: use React.lazy, dynamic imports, or webpack chunk splitting for non-critical JS
  • Render-blocking scripts: move to async or defer, or inline critical CSS instead
  • INP: reduce main thread blocking — move heavy computation to web workers
  • CLS: set explicit width and height on dynamically loaded images and ad slots
  • Use PageSpeed Insights field data (CrUX) to see real-user Core Web Vitals, not just lab scores
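The thresholds in the first bullet generalize to all three metrics; encoding them makes audits scriptable. The LCP values come from the bullet above, while the INP and CLS thresholds below are Google's published ones from its web.dev documentation:

```javascript
// Classify Core Web Vitals readings against the published
// thresholds: LCP from the bullet above; INP (≤200 ms good,
// >500 ms poor) and CLS (≤0.1 good, >0.25 poor) per web.dev.

const THRESHOLDS = {
  lcp: { good: 2500, poor: 4000 }, // milliseconds
  inp: { good: 200, poor: 500 },   // milliseconds
  cls: { good: 0.1, poor: 0.25 },  // unitless layout-shift score
};

function classify(metric, value) {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return 'Good';
  if (value <= t.poor) return 'Needs Improvement';
  return 'Poor';
}

console.log(classify('lcp', 2100)); // → Good
console.log(classify('lcp', 4700)); // → Poor
console.log(classify('inp', 350));  // → Needs Improvement
console.log(classify('cls', 0.08)); // → Good
```

Feeding CrUX field data (from the PageSpeed Insights API, for example) through this function turns the "use field data" bullet into an automated check.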

Common JavaScript SEO Mistakes and How to Fix Them

Across hundreds of JavaScript SEO audits, the same mistakes appear repeatedly. The most damaging: infinite scroll without an HTML fallback — search engines cannot trigger scroll events, so any content that only loads via infinite scroll is invisible to crawlers. Fix: implement pagination as a fallback with crawlable, self-canonicalizing page URLs (Google stopped using rel=next/prev as an indexing signal in 2019, so the page URLs themselves must be linked and indexable). Lazy-loaded images that depend on scrolling: images loaded by custom JavaScript, such as an Intersection Observer swapping a data-src placeholder into src, may never load in Googlebot's rendering environment, because Googlebot renders with a tall viewport but does not scroll like a user. Fix: ensure important images have explicit src attributes in server HTML — native loading='lazy' with a real src is handled correctly — not just placeholder data-src values. Single-page application routing without server-side rendering: a React Router or Vue Router application that handles all routing client-side means every URL returns the same empty HTML shell. Fix: implement SSR or prerendering so each route returns appropriate content. JavaScript redirects (window.location.href) instead of server-side redirects: these are slower, may not pass PageRank correctly, and are unreliable in rendering environments. Fix: implement all redirects as 301 server-level redirects.

  • Infinite scroll: always provide HTML-accessible pagination fallback for crawlers
  • Lazy images: use native loading=lazy with explicit src (not data-src) for critical above-fold images
  • Client-side routing: ensure each route serves unique, crawlable content via SSR or prerendering
  • JavaScript redirects: replace all window.location redirects with server-side 301 redirects
  • JavaScript-gated content: content behind login, cookie consent, or JS-triggered tabs may not index
  • A/B testing scripts: ensure control variant serves SEO-correct content — variants can cause canonicalization issues
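The infinite-scroll fix above amounts to generating crawlable page URLs that mirror what the scroll loader fetches. A sketch (the /page/N URL pattern is an assumption; match your own routes and link the pages with plain anchor tags):

```javascript
// Sketch of the infinite-scroll fallback: generate static, crawlable
// pagination URLs that mirror what the scroll loader fetches. The
// /page/N URL pattern is an assumption — adapt to your own routes.

function paginationUrls(baseUrl, totalItems, perPage) {
  const pages = Math.max(1, Math.ceil(totalItems / perPage));
  const urls = [baseUrl]; // page 1 lives at the base URL itself
  for (let p = 2; p <= pages; p++) {
    urls.push(`${baseUrl}/page/${p}`);
  }
  return urls;
}

console.log(paginationUrls('https://example.com/blog', 45, 20));
// → [ 'https://example.com/blog',
//     'https://example.com/blog/page/2',
//     'https://example.com/blog/page/3' ]
```

Each generated page should contain plain `<a href>` links to its neighbours so crawlers can walk the full set without executing any scroll handlers.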

JavaScript SEO Audit Checklist for Developers

A practical checklist to systematically identify and fix JavaScript SEO issues. Run it before any major release and at minimum quarterly for JavaScript-heavy applications: check the raw HTML source for all critical content, meta tags, and structured data; verify Core Web Vitals in PageSpeed Insights using field data; run Screaming Frog in both non-JS and JS mode and compare content extraction; test key pages with URL Inspection in GSC; compare Googlebot's rendered screenshot to the browser view to check rendering completeness; audit any paginated content for infinite-scroll fallbacks; verify all important images have explicit src attributes; confirm structured data validates in the Rich Results Test; check that internal links are standard anchor tags, not JS click handlers; verify canonical tags and meta robots directives are in the server HTML; test navigation with JavaScript disabled in Chrome; and review Lighthouse performance scores for all priority page templates.

  1. Check raw HTML (Ctrl+U) of 10 representative pages for content, title, meta, canonical, JSON-LD
  2. Run Screaming Frog: non-JS crawl vs JS crawl, compare word count and link extraction
  3. GSC URL Inspection: test 10 key URLs, review rendered HTML and screenshot
  4. PageSpeed Insights: record LCP, INP, CLS for homepage and top landing pages
  5. Disable JavaScript in Chrome, test full navigation and content visibility
  6. Run Rich Results Test on pages with structured data to verify schema visibility
  7. Cross-reference Googlebot crawl frequency from logs against critical page list
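Step 1 of the checklist can be partially automated with string-level presence checks on the raw HTML. These are rough heuristics, not a parser, and the sample pages are invented:

```javascript
// Sketch automating step 1 of the checklist: string-level checks of
// a page's raw HTML for the elements that must be server-rendered.
// Rough presence checks only — not a substitute for a real parser.

function auditRawHtml(rawHtml) {
  return {
    hasTitle: /<title[^>]*>[^<]+<\/title>/i.test(rawHtml),
    hasMetaDescription: /<meta[^>]+name=["']description["']/i.test(rawHtml),
    hasCanonical: /<link[^>]+rel=["']canonical["']/i.test(rawHtml),
    hasJsonLd: /<script[^>]+type=["']application\/ld\+json["']/i.test(rawHtml),
  };
}

// Invented sample pages: a CSR shell vs a server-rendered page.
const emptyShell = '<html><head></head><body><div id="app"></div></body></html>';
const serverRendered = `<html><head>
  <title>Blue Widgets</title>
  <meta name="description" content="Widgets in blue.">
  <link rel="canonical" href="https://example.com/widgets">
  <script type="application/ld+json">{"@type":"Product"}</script>
</head><body><h1>Blue Widgets</h1></body></html>`;

console.log(auditRawHtml(emptyShell));     // all four checks false
console.log(auditRawHtml(serverRendered)); // all four checks true
```

Running this over the 10 representative pages from step 1 gives an instant pass/fail grid for the server-rendered essentials before any manual review.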

JavaScript SEO is not about avoiding JavaScript — it is about understanding how search engines interact with JavaScript and making deliberate architecture decisions that ensure your content is accessible. The core principle is simple: critical content (product descriptions, articles, metadata, structured data) must be in server-rendered HTML. The rendering strategy that achieves this most reliably is SSG for content that does not change per request, and SSR for dynamic content. For existing JavaScript applications with CSR architecture, prerendering is the fastest fix while migrating to a proper SSR framework. Audit your JavaScript rendering quarterly — changes to your JS framework, dependencies, or rendering pipeline can introduce new issues silently.

Frequently Asked Questions

Does Google fully render all JavaScript on every page?

No. Google renders JavaScript in a second wave that can be delayed hours to weeks. Although the Web Rendering Service now runs an evergreen (recent stable) version of Chromium, rendering can still fail due to blocked resources, script errors, or timeouts. Very heavy JS pages may have rendering terminated before completion due to resource limits. Critical content should always be in server-rendered HTML, not solely dependent on JavaScript execution.

Is Next.js the best framework for SEO?

Next.js is currently the most SEO-friendly React framework due to its hybrid rendering model — you can choose SSG, SSR, ISR, or CSR per page. For purely content-focused sites, Astro (which ships zero JavaScript by default) or Gatsby (SSG-first) may be preferable. The best framework for SEO is the one that ensures your content is in server-rendered HTML — Next.js just makes this easiest for React developers.

Can I use Create React App (CRA) for a site that needs to rank on Google?

CRA is CSR-only, which means all content is rendered client-side. Google can eventually index this content through wave two rendering, but indexing is slower, less reliable, and misses other search engines. For any site that needs SEO, migrate from CRA to Next.js or add a prerendering layer. CRA is suitable for authenticated app sections that don't require indexing.

What is dynamic rendering and is it a cloaking violation?

Dynamic rendering serves server-rendered HTML to crawlers and client-side rendered content to users. Google officially documented this as an acceptable workaround in 2018 and does not treat it as cloaking, provided the content served to bots is identical to the content served to users — not a stripped or keyword-stuffed version. Use Prerender.io or Rendertron for implementation. Google recommends moving to SSR or SSG as a long-term replacement.

How do I ensure structured data is indexed when using a JavaScript framework?

Structured data (JSON-LD) must be in the initial server-rendered HTML response, not injected via useEffect, componentDidMount, or any client-side lifecycle method. In Next.js, add JSON-LD in your page component's return statement or via the metadata API. Verify it is present by viewing raw HTML source (Ctrl+U) — if you only see it in DevTools Elements panel (which shows the DOM after JS execution), it is client-side and may not be reliably indexed.

Do Core Web Vitals affect rankings for JavaScript sites specifically?

Core Web Vitals are a ranking factor for all sites, but JavaScript-heavy sites typically struggle more with LCP and INP due to the render-blocking nature of large JS bundles. A CSR React app with a 400KB JS bundle will have significantly worse LCP than an equivalent SSG page delivering pre-built HTML. Optimising JavaScript performance — code splitting, tree shaking, critical CSS inlining — is therefore both a performance and ranking improvement for JS-heavy sites.

How can I test what Googlebot actually sees on my pages?

Three methods: first, Google Search Console URL Inspection tool — click 'Test Live URL' then 'View Tested Page' to see the HTML and screenshot Google recorded. Second, Screaming Frog with JavaScript rendering disabled — this mimics Googlebot wave one. Third, disable JavaScript in Chrome DevTools (Settings > Debugger > Disable JavaScript) and reload the page. All three methods should show the same critical content for your SEO to be reliable.
