When Headless-Render-API (formerly Prerender.cloud) first launched in 2016, shortly after headless Chromium itself debuted, JavaScript clients built with React and Angular were becoming very popular, but Googlebot was still unable to index them. Pre-rendering was a generic solution to this problem; it worked well, and Headless-Render-API was the first service to offer the faster headless Chromium instead of PhantomJS for generic pre-rendering.
But in recent years, official Google representatives have said that Googlebot can index JavaScript sites:

> Once Google's resources allow, a headless Chromium renders the page and executes the JavaScript.

However, they also admit:

> Keep in mind that server-side or pre-rendering is still a great idea because it makes your website faster for users and crawlers, and not all bots can run JavaScript.
SEO consultants will say "if it renders in the Google Search Console URL inspection tool, then it's working" — but just because a page renders in the console doesn't mean Google is actually indexing it. Google limits how much it will crawl and render per site, which is generally referred to as the "crawl budget".

In practice, it seems server-side rendering (or pre-rendering via a headless browser) is still necessary.
It's frustrating that we can't get clear answers, so the best we can do is monitor Search Console on a site-by-site basis and add pre-rendering or SSR if Google isn't indexing a particular JavaScript-rendered site.
However, separate from the ambiguity of Google's indexer, pre-rendering (or SSR) is still necessary to get link previews for social media crawlers like Facebook, X/Twitter, and LinkedIn, since those crawlers generally do not execute JavaScript. So in the end, you probably still need a pre-rendering or SSR solution.
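As an illustration of the general pattern (a minimal sketch, not Headless-Render-API's implementation — the user-agent list, `PRERENDER_ORIGIN`, and target URL are assumptions for the example), a server can detect link-preview crawlers by user agent and serve them pre-rendered HTML while regular browsers get the normal JavaScript app:

```javascript
// Hypothetical sketch: serve pre-rendered HTML to social media crawlers
// (which generally do not execute JavaScript); everyone else gets the
// client-side JavaScript app. Requires Node 18+ for the global fetch.
const express = require("express");

const app = express();

// User-agent substrings for common link-preview crawlers (assumption)
const BOT_USER_AGENTS = ["facebookexternalhit", "Twitterbot", "LinkedInBot"];

// Origin of a pre-rendering service that returns fully rendered HTML (assumption)
const PRERENDER_ORIGIN = "https://prerender-service.example.com";

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] || "";
  const isBot = BOT_USER_AGENTS.some((bot) => ua.includes(bot));
  if (!isBot) return next(); // normal users get the JavaScript app

  // Crawlers get pre-rendered HTML with the meta tags already in place
  const rendered = await fetch(`${PRERENDER_ORIGIN}/https://example.com${req.url}`);
  res.status(rendered.status).send(await rendered.text());
});

app.use(express.static("dist")); // serve the client-side app as usual

app.listen(3000);
```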
If you want to pre-render meta tags only (and allow Google to index the JavaScript-rendered content), you can use our prerender-meta-only feature: use the `prerender-meta-only: true` HTTP header, or `prerendercloud.set("metaOnly", true)` if using our JavaScript SDK.
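For example, with the JavaScript SDK (a minimal sketch — the `set("metaOnly", true)` call is from the docs above, while the surrounding Express wiring and `prerenderToken` option follow the prerendercloud npm package's typical usage, so verify the details against your setup):

```javascript
// Sketch of enabling meta-only pre-rendering via the JavaScript SDK.
const prerendercloud = require("prerendercloud");

prerendercloud.set("prerenderToken", process.env.PRERENDER_TOKEN); // your API token
prerendercloud.set("metaOnly", true); // pre-render only the meta tags,
                                      // leaving body content for Google to index

const express = require("express");
const app = express();

app.use(prerendercloud); // must come before static/file handlers
app.use(express.static("dist"));

app.listen(3000);
```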