Recently I’ve been working on a site (in Swedish) that is entirely a React app, as well as other single page applications, and I’ve struggled with SEO. The main reason is that Googlebot is simply not yet smart enough to render a SPA properly in all scenarios.

It is true that it sometimes manages to wait long enough for the main site to load, but many times all you will get when doing “Render & Fetch” in Webmaster Tools is a blank render showing the loading bar.

Googlebot only renders the loading screen, screenshot from Google Webmaster Tools
Look at the screenshot to the left: this is going to be a problem.

At first, I was hopeful that it would still index correctly. Occasionally it appears to work: Googlebot has picked up keywords that are only present in the React application, and sometimes Render & Fetch will do the correct thing. Google has clearly realized that there is more stuff on the page.

However, it is not universally true that Google can render it. During my time continuously developing the page, many terms have in fact not been picked up by Googlebot, and the most popular keyword for the site according to Webmaster Tools is “loading”. That is not great.

To resolve the issues, my first approach was to put some static HTML containing the menu, with <a> tags linking to the different subpages, inside the React div that would be replaced by the app at load time. The HTML was almost identical to what would later be output by React, a “poor man’s server-side rendering”. My hope was that having the links directly in the HTML would help Google find the other pages and index them, as otherwise there appeared to be no links at all on the front page when it was first loaded.
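The idea, sketched below with illustrative paths and markup (not the actual site’s), was that a crawler which never executes JavaScript would still see real links inside the mount point:

```html
<!-- Illustrative sketch: static links placed inside the React
     mount point. React replaces this content once the bundle
     loads, but a crawler that does not run JS still sees the
     <a> tags and can follow them. Paths are hypothetical. -->
<div id="root">
  <nav>
    <a href="/polls">Polls</a>
    <a href="/faq">FAQ</a>
  </nav>
</div>
<script src="/bundle.js"></script>
```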

This was in vain.

The next approach was creating more text-heavy pages. Because the site presents political polls, all the pages are almost identical in structure, with huge charts everywhere. These are tough for Google to index. To remedy this I created a FAQ page with lots of text for Google to gobble up. This was not a success either: even a month after I added it to the site, Google had not indexed any of the text on it, nor had any of the other pages on the site been added to the overview (despite them being in the sitemap).

Shows 14 pages submitted via sitemap, and only 1 being indexed, screenshot from Google Webmaster Tools
A sad sight for any webmaster.

The next step was to implement server-side rendering of React, meaning every page is pre-rendered with the complete HTML that the client-side React will render when it runs. And this works! I had some struggles getting React to render to a static HTML file (rather than on demand server-side), but once I finally got it working it has been smooth, and Googlebot now picks up all the keywords I want it to.

A graph that shows 5 pages indexed on 15th of May, screenshot from Google Webmaster Tools
Finally it seems Google picks up more than the front page.

In conclusion, server-side rendering should be seen as mandatory for single page applications as of today. Even if Google sometimes manages to render the page with all JS working, it is not 100 % reliable, and it will fail to properly index your entire site if you rely on it.