How client-side rendering became the default
The move to client-side rendering made sense when it happened. React changed how people thought about building interfaces. Vue and Angular followed. The pitch was real: better developer experience, clean separation between frontend and backend, cheap static hosting. You could deploy a bundle to a CDN, let the browser do the heavy lifting, and scale without managing servers.
The technical outcome was a generation of websites that, on first request, deliver almost nothing. A thin HTML shell, a JavaScript bundle, and a blank container waiting to be filled. The content assembles in the browser once the script runs.
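Concretely, the first response from a client-side rendered site tends to look something like this (hypothetical markup; the container id and bundle filename vary by framework):

```html
<!-- Roughly what a CSR build ships on first request: no content, just a mount point -->
<!DOCTYPE html>
<html>
  <head>
    <title>Example Store</title>
    <script defer src="/assets/main.abc123.js"></script>
  </head>
  <body>
    <!-- Everything the visitor eventually sees is injected here by JavaScript -->
    <div id="root"></div>
  </body>
</html>
```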
For a human visitor with a modern browser, this is invisible. For a crawler that does not execute JavaScript, it is the whole story.
What AI crawlers actually see
Googlebot operates on a two-pass system: it fetches and indexes the raw HTML first, then returns later, sometimes days later, to render the JavaScript. Slow, but eventually complete.
GPTBot, ClaudeBot, and PerplexityBot work differently. They fetch the page once, read what is in the HTML response, and stop. An analysis of 569 million GPTBot requests found zero evidence of JavaScript execution. The crawlers sometimes download JavaScript files, but they do not run them. [Source: AI Crawlers & JavaScript Rendering, SearchViu 2025]
For a client-side rendered site, that means these crawlers see nothing. An empty container and a script tag. No headings, no body copy, no structured content of any kind.
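To make that concrete, here is a minimal sketch of what a non-executing crawler can extract from a raw HTML response: strip out scripts, styles, and tags, and keep whatever text remains. The regex stripping is deliberately crude (real crawlers use proper HTML parsers), but it illustrates the gap. Run against a CSR shell, it comes back with nothing.

```javascript
// Rough approximation of the text a crawler that does not execute
// JavaScript can extract from a raw HTML response. Crude regex
// stripping, for illustration only; not how production crawlers parse.
function extractVisibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // scripts may be downloaded, but never run
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")                   // drop the remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

const csrShell = '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>';
const ssrPage  = "<html><body><h1>Pricing</h1><p>Plans start at £9.</p></body></html>";

console.log(extractVisibleText(csrShell)); // ""
console.log(extractVisibleText(ssrPage));  // "Pricing Plans start at £9."
```

The server-rendered page yields its headings and body copy; the client-rendered shell yields an empty string, which is exactly what the crawler passes on.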
A gap no dashboard flags
What strikes me about this is how invisible it is. A site can hold reasonable search rankings (Google eventually renders the JavaScript) and simultaneously have zero presence in AI-generated answers. Search Console shows no problem. Nobody sends an alert.
You can verify the gap using citation tracking tools like Promptwatch or Rankshift (amongst others), which show whether your content is actually appearing in AI-generated responses. Most teams I talk to have not checked.
The reason to check now rather than later is the compounding effect. AI systems tend to cite sources that have been cited before, so early visibility reinforces itself. A site that is invisible to AI crawlers today is not simply missing some traffic; it is also failing to accumulate the citation history that would help it appear in future answers. GEO (generative engine optimisation) is the name that has stuck for this problem, and unlike traditional SEO, it depends directly on what is in the initial HTML response. Server-side rendering is not optional for it.
The edge argument
The standard case against SSR was always infrastructure overhead. More server load, more latency, harder to scale. It was a fair concern in 2016.
Edge computing has made it obsolete. Cloudflare Workers runs across more than 300 data centres globally, with cold starts under 5 milliseconds. SSR at the edge can deliver the initial HTML response faster than a client-side bundle can download and execute in the browser. The latency argument has reversed. I covered the edge performance case in more detail in the context of personalisation (Your personalisation is costing you sales), and the same infrastructure logic applies here.
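As a sketch of what SSR at the edge looks like, here is a minimal Cloudflare Worker-style handler. The page data and `renderPage` helper are hypothetical; a real application would render a framework component (React, Svelte, etc.) to a string at this point rather than using plain templating.

```javascript
// Minimal sketch of SSR at the edge: build complete HTML per request,
// so crawlers and browsers alike get real content in the first response.
// renderPage is plain string templating for illustration; a real app
// would render a framework component to a string here instead.
function renderPage({ title, body }) {
  return `<!DOCTYPE html>
<html>
  <head><title>${title}</title></head>
  <body>
    <h1>${title}</h1>
    <p>${body}</p>
  </body>
</html>`;
}

// Cloudflare Worker-style entry point. In an actual Worker module this
// object would be the default export: `export default worker;`
const worker = {
  async fetch(request) {
    // Hypothetical data lookup; in practice this might hit a KV store or an API.
    const page = { title: "Pricing", body: "Plans start at £9 per month." };
    return new Response(renderPage(page), {
      headers: { "content-type": "text/html; charset=utf-8" },
    });
  },
};
```

The point is the shape of the thing: the HTML is complete before it leaves the edge node, so the first-request problem from the previous section never arises.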
The cost model has also shifted. Edge computing is pay-per-use. No idle servers, no capacity planning. The infrastructure objection that made SSR a hard sell for years no longer holds the same weight.
The practical check
There is one thing worth doing before anything else: find out what your site actually delivers on first request.
Open it with JavaScript disabled, or look at the raw HTML in DevTools before scripts execute. If the content is there (headings, body copy, structured information), you are in a workable position. If you see an empty container and a bundle reference, that is the gap.
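The manual check can also be scripted. Below is a rough heuristic of my own devising, not a standard tool: flag a response as likely client-rendered if it loads external script bundles but no heading or paragraph tag contains actual text. Treat a flag as a prompt to look closer, not a verdict.

```javascript
// Rough heuristic for "is this first response just a client-side shell?"
// Hypothetical check, not a standard tool; edge cases abound.
function looksClientRendered(html) {
  // Does the page load external script bundles?
  const hasBundle = /<script\b[^>]*\bsrc=/i.test(html);
  // Does any heading or paragraph tag directly contain text?
  const hasContent = /<(h1|h2|h3|p)\b[^>]*>\s*[^<\s]/i.test(html);
  return hasBundle && !hasContent;
}

// Usage sketch (Node 18+, where fetch is global):
// const html = await (await fetch("https://example.com")).text();
// console.log(looksClientRendered(html) ? "likely CSR shell" : "content in first response");
```

The same fetch-and-inspect step is roughly what a single-pass crawler performs, which is why checking the raw response, rather than the rendered page, is the honest test.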
Modern frameworks handle this differently. Remix, Next.js, and Astro default to server-side or static rendering. A standard Create React App setup is pure client-side. Nuxt and SvelteKit support SSR but need deliberate configuration. The question is not which framework your team chose but which mode it is running in.
If you want to go further than rendering and think about how AI agents interact with your site structurally, that is a separate but related question, one I looked at in Your website was built for humans. AI agents have different needs.
