
JavaScript SEO: Does Google actually treat JavaScript-based websites worse?


Vercel, the company behind the Next.js framework, has conducted a study that debunks the myth that Google indexes JavaScript-based websites more slowly and less thoroughly than others.

Over the years, several myths have taken root in the SEO community around the indexing and search engine optimization of JavaScript-based websites. The prevailing opinion is that Google indexes such sites more slowly and less thoroughly than other websites, which supposedly hurts overall JavaScript SEO results.

Vercel partnered with MERJ, a consulting company specializing in SEO and data engineering, to debunk these myths. Together, they conducted a study to analyze Googlebot’s behavior and how it crawls sites.

In this article, we’ll present the results of their study on JavaScript SEO and the conclusions they reached.

Key information

  • There were no noticeable differences in how Google renders JavaScript-based sites compared with other sites.
  • Google doesn’t use separate criteria for indexing pages with JavaScript.
  • Rendering websites written in JavaScript doesn’t generate long queues (lasting several days).
  • Google doesn’t discover JavaScript-heavy sites more slowly.


What are Googlebot’s capabilities?

Over the years, Googlebot’s crawling capabilities have changed many times and continue to change. In the past, website indexing was heavily limited; only static HTML sites were fully visible to the crawler. The overall JavaScript SEO results left much to be desired.


The situation changed gradually as the technology evolved, progressing through the AJAX crawling scheme and page rendering with a headless Chrome browser to the system Google uses today.

According to Vercel, the current Google indexing system is characterized by the following:

  • Renders all HTML pages, not only a subset of them
  • Supports JavaScript features, which improves JavaScript SEO
  • Renders pages in a new browser session without preserving cookies or the previous rendering state
  • Doesn’t click on cookie banners or interact with “hidden” content (e.g., content tucked behind an accordion menu); see the sketch after this list
  • Prohibits cloaking, i.e., using the User-Agent header to serve Googlebot different content than users see in order to manipulate rankings
  • Speeds up site rendering by caching assets
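Since Googlebot doesn’t click anything, content hidden behind an accordion is only safe if it already sits in the HTML. As a loose illustration (a minimal React sketch with an invented question and answer, not something from the study), compare keeping the content in the DOM and merely hiding it visually with rendering it only after a click:

```tsx
// components/Accordion.tsx - an illustrative sketch; the question and answer text are invented
'use client';

import { useState } from 'react';

export default function Accordion() {
  const [open, setOpen] = useState(false);

  return (
    <div>
      <button onClick={() => setOpen(!open)}>What does hosting cost?</button>

      {/* The answer is always present in the rendered HTML and only hidden
          visually, so Googlebot can read it without clicking anything.
          Conditionally rendering it (open && <p>...</p>) would keep it out
          of the DOM until a click, and Googlebot never clicks. */}
      <p style={{ display: open ? 'block' : 'none' }}>
        Hosting starts at 5 EUR per month.
      </p>
    </div>
  );
}
```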

Vercel analyzed over 100,000 Googlebot fetches of various web pages to test the bot’s capabilities and describe its behavior.

How Vercel debunked 4 myths regarding JavaScript SEO and the indexing of JavaScript websites

The study focused on the following websites: nextjs.org, monogram.io, and basement.io. It started on 1 April 2024 and lasted 30 days.


For the study, Vercel used its own solution, Edge Middleware, to intercept and analyze requests generated by search engine bots.
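The article doesn’t publish the middleware itself, but the general idea can be sketched roughly as follows, assuming a standard Next.js setup; the log line and its destination are placeholders, not Vercel’s actual analysis pipeline.

```ts
// middleware.ts (project root) - a minimal sketch, not Vercel's actual study code
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  const userAgent = request.headers.get('user-agent') ?? '';

  // Single out requests coming from Googlebot so they can be logged and analyzed.
  if (userAgent.toLowerCase().includes('googlebot')) {
    console.log(`[bot] ${request.method} ${request.nextUrl.pathname} | ${userAgent}`);
  }

  // Let every request through unchanged; the middleware only observes traffic.
  return NextResponse.next();
}
```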

During the study, they analyzed 37,000 rendered pages, focusing mainly on the data coming from Googlebot.

Based on this analysis, Vercel set out to debunk the myths about JavaScript SEO and the crawling of JavaScript-based pages.

Can Google render content with JavaScript?

When studying Google’s capabilities to render JavaScript content, Vercel considered the following elements:

  • Compatibility with the Next.js framework
  • Dynamic content indexing
  • Streamed content with React Server Components
  • The metric for rendering success rate


In summary, the company analyzed how Googlebot behaves during the interaction with the JavaScript framework on nextjs.org. The site uses rendering strategies such as static rendering, client-side rendering, and server-side rendering.

They tested pages on nextjs.org that use API calls for asynchronous content loading to check whether Googlebot crawls content that isn’t present in the initial HTML response. Additionally, they tested if the bot could process streamed content.
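Asynchronously loaded content of this kind typically looks something like the sketch below; it’s a generic illustration rather than code from nextjs.org, and the /api/docs endpoint is hypothetical.

```tsx
// components/AsyncDocs.tsx - a generic illustration; the /api/docs endpoint is hypothetical
'use client';

import { useEffect, useState } from 'react';

export default function AsyncDocs() {
  const [docs, setDocs] = useState<string[]>([]);

  // The list is fetched in the browser, so it is absent from the initial HTML
  // response and only shows up once the page is actually rendered.
  useEffect(() => {
    fetch('/api/docs')
      .then((res) => res.json())
      .then((data: string[]) => setDocs(data));
  }, []);

  return (
    <ul>
      {docs.map((title) => (
        <li key={title}>{title}</li>
      ))}
    </ul>
  );
}
```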

Vercel determined the rendering success rate by comparing the number of Googlebot’s requests in server logs to the number of successfully received rendering beacons. This gave them the percentage of pages that were fully and successfully rendered.
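The beacon itself isn’t published either, but the idea can be sketched along these lines; the /api/render-beacon endpoint and the payload are assumptions made for illustration.

```tsx
// components/RenderBeacon.tsx - a rough sketch of the idea; endpoint and payload are assumed
'use client';

import { useEffect } from 'react';

export default function RenderBeacon() {
  // The beacon fires only when the page has actually been rendered in a browser
  // (including Googlebot's headless Chrome), so comparing server-log requests
  // with received beacons yields a rendered-vs-requested ratio.
  useEffect(() => {
    navigator.sendBeacon(
      '/api/render-beacon',
      JSON.stringify({ path: window.location.pathname, renderedAt: Date.now() })
    );
  }, []);

  return null;
}
```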

They found that 100% of the 100,000 Googlebot fetches on nextjs.org resulted in fully rendered pages, including pages containing complex JavaScript interactions (pages with a noindex tag and pages returning error status codes were excluded).

They also confirmed Google’s ability to index pages loaded asynchronously and the full rendering of the Next.js framework. The rendering wasn’t disrupted by the streamed content in any way.
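Streamed content of the kind tested here usually comes from React Suspense boundaries in the Next.js App Router. A minimal sketch (SlowSection and its artificial delay are invented) might look like this:

```tsx
// app/streaming-demo/page.tsx - a minimal sketch; SlowSection and its delay are invented
import { Suspense } from 'react';

// A server component with a slow data dependency; in a real page this would be a fetch.
async function SlowSection() {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  return <p>This block is streamed to the client after the page shell.</p>;
}

export default function Page() {
  return (
    <main>
      <h1>Streaming demo</h1>
      {/* The shell is sent immediately; the content inside the Suspense
          boundary is streamed in once it is ready. */}
      <Suspense fallback={<p>Loading...</p>}>
        <SlowSection />
      </Suspense>
    </main>
  );
}
```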

Additionally, Vercel discovered that Googlebot tries to render all HTML pages during crawling, not just JavaScript-heavy subsets.

Does Google treat JavaScript-based websites differently?

Another myth that Vercel tried to debunk was the opinion that Google has special criteria for indexing websites based on JavaScript (JavaScript SEO).


To test this, the company used a page without any JavaScript code that contained a CSS file importing another CSS file. Then, it compared its rendering behavior to that of a page with JavaScript enabled.

The results of this test confirmed that Google renders pages with JavaScript support the same way it does with websites without JavaScript code.

Next, Vercel created an application based on Next.js to test how Google renders pages when HTML status codes occur and when pages have the noindex tag enabled.

This test showed that Google renders all pages returning a 200 status code, regardless of their JavaScript content. Pages returning a 304 status code were rendered based on the content of the original 200 response, while pages returning other 3xx, 4xx, or 5xx status codes weren’t rendered.

Pages containing a noindex tag weren’t rendered, even when JavaScript code meant to remove the tag on the client side was added. If Googlebot sees the noindex tag in the initial HTML response, it won’t render the page.
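In other words, a workaround like the one below (purely illustrative) won’t get a page indexed, because Googlebot respects the noindex it sees in the initial HTML and skips rendering altogether:

```ts
// remove-noindex.ts - illustrative only; this client-side workaround does NOT work.
// Assume the server response already contains: <meta name="robots" content="noindex">

// Removing the tag after the page loads in the browser happens too late:
// Googlebot has already seen "noindex" in the initial HTML response and
// will not render the page at all.
document.querySelector('meta[name="robots"]')?.remove();
```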

Moreover, Vercel checked whether the indexing process differs depending on the complexity level of the JavaScript code. The test included pages with a minimal number of JS interactions, a moderate amount, and dynamic pages with extensive rendering on the client side.

Additionally, the time between the first crawl and the completed render was compared to test whether more complex JS code leads to longer rendering queues or processing times.

No significant differences in successfully completed renders were found. There was also no link between the JS code's complexity level and delayed rendering. However, Vercel emphasized that complex JavaScript code on a bigger website can affect crawling efficiency, which might influence the overall JavaScript SEO score.

Is there a rendering queue, and does it affect JavaScript SEO?

Vercel’s study also tested the relationship between JavaScript-heavy sites and rendering time.


Their test checked how much time passes between the first Google crawl and a completed render, how URL types (with and without query strings) and different page sections influence rendering speed, and how often Google re-renders pages, including whether any patterns appear in how frequently different page sections are re-rendered.

For the 37,000 analyzed pages, Vercel found that the time from the first crawl to a completed render was distributed as follows:

  • 25th percentile of pages is rendered within the first 4 seconds
  • 50th percentile is rendered after 10 seconds
  • 75th percentile renders after 26 seconds
  • 90th percentile is ready after around 2 hours
  • 95th percentile renders after 6 hours
  • 99th percentile is ready after 18 hours

Looking at this data, we can already see that the myth about long rendering queues, prevalent in the JavaScript SEO community, doesn’t hold water. Vercel also emphasized that pages rendered after 18 hours are the exception rather than the rule.


As for URLs without query strings, the rendering time looks as follows:

  • 50th percentile after 10 seconds
  • 75th percentile after 22 seconds
  • 90th percentile after around 2.5 hours

For comparison, URLs with query strings look like this:

  • 50th percentile renders after 13 seconds
  • 75th percentile after 31 minutes
  • 90th percentile after around 8.5 hours

The collected data showed that Google treats URLs with query strings that don’t change the page’s content differently and renders them noticeably more slowly.

Additionally, Vercel observed that sections of a site that are updated frequently are re-rendered faster than more static sections, where re-rendering took longer. This doesn’t hurt JavaScript SEO in itself; it simply means that Google prioritizes rendering fresh content.

The data also don’t indicate that indexing of pages lasts several days, contrary to popular belief.

Does Google discover JavaScript-heavy pages slower?

The last myth Vercel tackled was the belief that Googlebot discovers JavaScript-heavy pages, especially those that use client-side rendering, more slowly.


To debunk this myth, Vercel conducted tests that compared link discovery on pages with different rendering strategies, such as server-side rendering, client-side rendering, and static rendering.

They also tested how Google discovers links for JavaScript pages that weren’t previously rendered and compared how quickly Google discovers pages linked in different ways: HTML links, links in content rendered on the client side, and links in JavaScript content that wasn’t previously rendered.

The results of this test showed that Googlebot discovers links in rendered pages without issue, regardless of the selected rendering method. It can also find links in JavaScript payloads that weren’t previously rendered.

That said, it was noted that pages rendered on the client side must finish rendering before Google can discover the links they contain. This gives server-side rendered and pre-rendered pages a slight advantage.
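To picture the difference: a plain anchor present in the server-rendered HTML is visible to Googlebot on the very first fetch, while a link like the one in this generic sketch (not one of the study’s test pages) only appears after client-side rendering and therefore has to wait for the render step:

```tsx
// components/LateLink.tsx - a generic sketch, not the study's actual test pages
'use client';

import { useEffect, useState } from 'react';

export default function LateLink() {
  const [ready, setReady] = useState(false);

  useEffect(() => {
    setReady(true);
  }, []);

  // This anchor only exists after client-side rendering, so Googlebot discovers
  // /pricing once the page has been rendered. A plain <a href="/pricing"> present
  // in the server-rendered HTML would be discovered already on the first fetch.
  return ready ? <a href="/pricing">Pricing</a> : null;
}
```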


Vercel also discovered that Googlebot doesn’t prioritize links differently depending on whether it found them during the first crawl or after rendering the page. Moreover, Googlebot evaluates the value of links for the site architecture only after it renders the entire page.

In summary, keeping the sitemap up to date reduces the time it takes Googlebot to discover links and the connections between them. It’s also crucial to check pages in Google Search Console to ensure the chosen rendering strategy doesn’t cause issues.
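In a Next.js project, for instance, keeping the sitemap fresh can be as simple as the sketch below, which uses the framework’s sitemap convention; the URLs and field values are placeholders.

```ts
// app/sitemap.ts - a minimal sketch using Next.js's sitemap convention; URLs are placeholders
import type { MetadataRoute } from 'next';

export default function sitemap(): MetadataRoute.Sitemap {
  // In a real project these entries would typically be generated from a CMS
  // or database; here they are hard-coded placeholders.
  return [
    {
      url: 'https://example.com/',
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 1,
    },
    {
      url: 'https://example.com/blog',
      lastModified: new Date(),
      changeFrequency: 'weekly',
      priority: 0.7,
    },
  ];
}
```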

Vercel's general recommendation is to focus on the JavaScript SEO best practices defined by Google, such as ensuring short page load times and creating content with users in mind. Large JavaScript files, for example, can significantly slow down page loading from the user’s perspective.
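One common way to keep heavy JavaScript from blocking the initial load in Next.js is to split large components into their own bundles with next/dynamic. A minimal sketch (HeavyChart is an invented component) could look like this:

```tsx
// components/Dashboard.tsx - a minimal sketch; HeavyChart is an invented component
'use client';

import dynamic from 'next/dynamic';

// The chart's JavaScript is split into its own bundle and loaded only in the
// browser, so it doesn't weigh down the initial page load.
const HeavyChart = dynamic(() => import('./HeavyChart'), {
  ssr: false,
  loading: () => <p>Loading chart...</p>,
});

export default function Dashboard() {
  return (
    <section>
      <h2>Traffic overview</h2>
      <HeavyChart />
    </section>
  );
}
```

Code-splitting like this reduces the JavaScript shipped on first load, which is exactly the kind of loading-time optimization the recommendation above refers to.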

Summary

Vercel and MERJ conducted extensive testing that allowed them to debunk popular myths rooted in the JavaScript SEO community. Thanks to their findings, we now know that Google doesn’t discriminate against websites written in JavaScript in any capacity. Its rendering and crawling process also doesn't differ from the norm.

The differences in the rendering and indexing of client-side rendered pages (where Googlebot is slightly less efficient) are small enough that they don’t matter much if we take care of more important SEO factors, such as page loading speed.

The findings of this test will undoubtedly come in handy for organizations that want to invest in the website optimization process in terms of JavaScript SEO and improve their ranking in Google search results.

The most frequently asked questions

Do the rendering and indexing processes of pages written in JavaScript differ from sites written in other programming languages?

Pages created with JavaScript are rendered and indexed the same way as sites built with other technologies. Googlebot renders pages with a current version of Chrome, which supports modern JavaScript features.

Are pages written in JavaScript subject to longer rendering queues?

JavaScript pages aren’t subject to longer rendering queues, and the rendering time usually doesn’t exceed several hours. However, it’s worth noting that Google takes longer to render URLs with query strings that don’t change the page’s content. It’s also worth remembering that optimizing JavaScript files improves rendering efficiency.

Does the presence of JavaScript influence Google's discovery of new pages?

The presence of JavaScript and the complexity of the code don’t influence Google's discovery of new pages or the overall JavaScript SEO score. The only thing worth noting is that when using client-side rendering, Google discovers URLs after the rendering process is complete.

Author: Radek
UX Writer and researcher by education + experience. Collects The Story's knowledge and shares it on the Journal.
