Two Point O

The Rendering Pendulum: Why server-side rendering matters for marketeers

Bert Swinnen
AI
Technology
Agentic web

Introduction

The web has always had an opinion about where rendering should happen. That opinion shifted decisively around 2015, and most teams have been living with the consequences ever since. One of those consequences is now becoming a visibility problem, and it is not one anyone was watching for.

How client-side rendering became the default

The move to client-side rendering made sense when it happened. React changed how people thought about building interfaces. Vue and Angular followed. The pitch was real: better developer experience, clean separation between frontend and backend, cheap static hosting. You could deploy a bundle to a CDN, let the browser do the heavy lifting, and scale without managing servers.

The technical outcome was a generation of websites that, on first request, deliver almost nothing. A thin HTML shell, a JavaScript bundle, and a blank container waiting to be filled. The content assembles in the browser once the script runs.

For a human visitor with a modern browser, this is invisible. For a crawler that does not execute JavaScript, it is the whole story.

What AI crawlers actually see

Googlebot operates on a two-pass system. It fetches the raw HTML first, then returns later to render JavaScript. Slow, but eventually complete.

GPTBot, ClaudeBot, and PerplexityBot work differently. They fetch the page once, read what is in the HTML response, and stop. An analysis of 569 million GPTBot requests found zero evidence of JavaScript execution. The crawlers sometimes download JavaScript files, but they do not run them. [Source: AI Crawlers & JavaScript Rendering, SearchViu 2025]

For a client-side rendered site, that means these crawlers see nothing. An empty container and a script tag. No headings, no body copy, no structured content of any kind.
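To make that concrete, here is a small sketch of what a single-pass crawler effectively does: read the initial HTML response and keep only the text outside script and style tags. The two sample documents are hypothetical stand-ins for a client-side shell and a server-rendered page.

```python
from html.parser import HTMLParser

class VisibleTextParser(HTMLParser):
    """Collects the text a non-JS crawler can read: everything
    outside <script> and <style> in the initial HTML response."""
    def __init__(self):
        super().__init__()
        self.in_skipped = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_skipped = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_skipped = False

    def handle_data(self, data):
        if not self.in_skipped and data.strip():
            self.chunks.append(data.strip())

def crawler_view(html: str) -> str:
    parser = VisibleTextParser()
    parser.feed(html)
    return " ".join(parser.chunks)

# A typical client-side rendered shell: markup is there, content is not.
csr_shell = """<html><body>
  <div id="root"></div>
  <script src="/static/js/bundle.js"></script>
</body></html>"""

# A server-rendered page delivers the content in the first response.
ssr_page = """<html><body>
  <h1>Rendering at the edge</h1>
  <p>Why the initial HTML response matters.</p>
</body></html>"""

print(repr(crawler_view(csr_shell)))   # ''  (nothing to index)
print(repr(crawler_view(ssr_page)))    # 'Rendering at the edge Why the initial HTML response matters.'
```

Everything a single-fetch crawler can cite has to be in that first string. For the shell, the string is empty.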

A gap no dashboard flags

What strikes me about this is how invisible it is. A site can hold reasonable search rankings (Google eventually renders the JavaScript) while having zero presence in AI-generated answers. Search Console shows no problem. Nobody sends an alert.

You can verify the gap using citation tracking tools like Promptwatch or Rankshift (amongst others), which show whether your content is actually appearing in AI-generated responses. Most teams I talk to have not checked.

The reason to check now rather than later is the compounding effect. AI systems tend to cite sources that have been cited before. Early visibility reinforces itself. A site that is invisible to AI crawlers today is not simply missing some traffic, it is also less likely to accumulate the citation history that would help it appear in future. GEO (generative engine optimisation) is the name that has stuck for this problem, and unlike traditional SEO, it depends directly on what is in the initial HTML response. Server-side rendering is not optional for it.

The edge argument

The standard case against SSR was always infrastructure overhead. More server load, more latency, harder to scale. It was a fair concern in 2016.

Edge computing has made it obsolete. Cloudflare Workers runs across more than 300 data centres globally, with startup times under 5 milliseconds. SSR at the edge can deliver the initial HTML response faster than a client-side bundle can download and execute in the browser. The latency argument has reversed. I covered the edge performance case in more detail in the context of personalisation (Your personalisation is costing you sales), and the same infrastructure logic applies here.

The cost model has also shifted. Edge computing is pay-per-use. No idle servers, no capacity planning. The infrastructure objection that made SSR a hard sell for years no longer holds the same weight.

The practical check

There is one thing worth doing before anything else: find out what your site actually delivers on first request.

Open it with JavaScript disabled, or look at the raw HTML in DevTools before scripts execute. If the content is there (headings, body copy, structured information), you are in a workable position. If you see an empty container and a bundle reference, that is the gap.
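The same check can be scripted. This sketch fetches the raw HTML the way a single-pass crawler would (no JavaScript execution) and estimates how much of the response is readable text. The 5% threshold is an arbitrary heuristic of mine, and the URL is a placeholder for your own site.

```python
import re
import urllib.request

def raw_text_ratio(html: str) -> float:
    """Fraction of the response that is readable text rather than
    script, style, or markup. Near zero suggests a CSR shell."""
    stripped = re.sub(r"(?s)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", stripped)
    words = text.split()
    return len(" ".join(words)) / max(len(html), 1)

def check(url: str) -> None:
    req = urllib.request.Request(url, headers={"User-Agent": "raw-html-check"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    ratio = raw_text_ratio(html)
    verdict = "content present" if ratio > 0.05 else "likely a CSR shell"
    print(f"{url}: {ratio:.1%} readable text ({verdict})")

# Usage (requires network access):
#   check("https://example.com/")  # substitute your own site
```

It is a blunt instrument, but it answers the one question that matters here: is there anything in the first response at all?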

Modern frameworks handle this differently. Remix, Next.js, and Astro default to server-side or static rendering. A standard Create React App setup is pure client-side. Nuxt and SvelteKit support SSR but need deliberate configuration. The question is not which framework your team chose but which mode it is running in.
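If you are not sure which setup you inherited, the raw HTML often carries fingerprints. A rough sniffer, using two markers I am reasonably confident about (they vary by framework version, so treat a hit as a hint, not proof):

```python
# Framework fingerprints in the initial HTML response. These are
# version-dependent heuristics, not a definitive detection method.
MARKERS = {
    "Create React App (client-side only)":
        "You need to enable JavaScript to run this app.",
    "Next.js (pages router, server-rendered)": "__NEXT_DATA__",
}

def sniff(html: str) -> list[str]:
    """Return the names of any frameworks whose marker appears."""
    return [name for name, marker in MARKERS.items() if marker in html]

print(sniff('<script id="__NEXT_DATA__" type="application/json">{}</script>'))
```

Whatever the sniffer says, the rendered output is the ground truth: look at what the first response actually contains.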

If you want to go further than rendering and think about how AI agents interact with your site structurally, that is a separate but related question, one I looked at in Your website was built for humans. AI agents have different needs.

FAQ

Why does server-side rendering matter for AI visibility?

AI crawlers such as GPTBot and ClaudeBot read only the initial HTML response, so a client-side rendered site is invisible to them. SSR makes your content discoverable by both human visitors and AI agents, and because AI systems tend to cite sources that have been cited before, early visibility compounds into citation history over time.

Is SSR still expensive to run?

Far less than it used to be. Edge platforms such as Cloudflare Workers run across more than 300 data centres globally with startup times under 5 milliseconds, on a pay-per-use model with no idle servers and no capacity planning. The infrastructure overhead that once made SSR a hard sell has largely disappeared.

What impact can I expect from adopting SSR?

Presence in AI-generated answers where a client-side site has none. An analysis of 569 million GPTBot requests found no evidence of JavaScript execution, which means client-side rendered sites appear to these crawlers as empty containers. Serving complete HTML closes that gap and lets citation history start accumulating.

How does SSR fit into a broader digital strategy?

As AI-generated answers take a more prominent place in search and discovery, what a crawler sees in the first response becomes a strategic concern rather than a purely technical one. SSR is the foundation for generative engine optimisation (GEO) and a prerequisite for staying visible as AI adoption grows.

How do I assess my current setup?

Open your website with JavaScript disabled, or view the raw HTML in DevTools before scripts execute. If the content is present, you are in a workable position. If you see an empty container and a bundle reference, that is the gap SSR addresses.

Let's talk

Ready to transform your digital challenges?

Contact