Escaped Fragment SEO

Web developers love using AJAX to build Single Page Applications with popular frameworks like React and Angular. An AJAX implementation can create an interactive, smooth, user-friendly web application that behaves much like a dedicated desktop application.

But the user-friendliness and functionality of AJAX sites can come with damaging consequences for your site's SEO. To search engine crawlers, AJAX isn't that different from Flash: navigation is effectively broken because every page loads at the same address, cloaking issues can arise, and the back, forward and reload buttons become useless.

AJAX SEO – Can Google Crawl AJAX?

AJAX's reputation is changing, however, with most search engines coming up with solutions that help webmasters improve both crawlability and user experience. These solutions haven't attracted much attention so far, and most AJAX websites remain unoptimized or unindexed in Google search results. If your site is one of them, this guide should help you get it included.

Google's engineers have been working to crawl and index AJAX pages better, and in the process have come up with several solutions, such as the escaped fragment in the URL, which has helped thousands of AJAX sites gain visibility in Google. Of late, Google has also started pre-rendering these pages on its own side for a better user experience.

Escaped Fragment Prerender

Prerender.io offers middleware that is installed on your server to inspect each incoming request. If a request is detected as coming from a crawler, the middleware fetches static HTML for that particular page and returns it. If not, the request proceeds to the standard server routes.
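As a rough illustration, here is a minimal sketch of how such crawler-detection middleware might look on a Node/Express server. The PRERENDER_URL endpoint, the crawler list, and the overall wiring are assumptions for the sketch, not prerender.io's actual implementation.

```ts
import { Request, Response, NextFunction } from "express";

const PRERENDER_URL = "http://localhost:3000/render"; // hypothetical render endpoint
const CRAWLER_AGENTS = ["googlebot", "bingbot", "yandexbot", "baiduspider"];

function isCrawler(req: Request): boolean {
  const ua = (req.headers["user-agent"] || "").toLowerCase();
  // Treat requests carrying the escaped-fragment parameter as crawler requests too.
  return CRAWLER_AGENTS.some((bot) => ua.includes(bot)) ||
    "_escaped_fragment_" in req.query;
}

export async function prerenderMiddleware(req: Request, res: Response, next: NextFunction) {
  if (!isCrawler(req)) return next(); // normal visitors get the regular AJAX app

  // Ask the (hypothetical) prerender service for static HTML of the requested page.
  const pageUrl = `${req.protocol}://${req.get("host") ?? ""}${req.originalUrl}`;
  const rendered = await fetch(`${PRERENDER_URL}?url=${encodeURIComponent(pageUrl)}`);
  res.status(rendered.status).send(await rendered.text());
}
```

In practice you would mount this middleware before your application routes, so crawler traffic is answered with static HTML while everything else falls through unchanged.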

SEOs often face a dilemma when AJAX is used to build a website. These sites load content into the page much faster and provide a great user experience, but they are not reliably crawled by Google, which harms the site's SEO. Fortunately, Google has announced a proposal that helps webmasters get the best of both worlds. Websites following the proposal should serve two versions of their content:

  1. For JavaScript-enabled users, with an 'AJAX-style' URL
  2. For search engines, with a traditional static URL

The new protocol requires using a hash followed by an exclamation mark (#!), known as a 'hashbang'. When you use the hashbang in a page URL, Google recognizes that you are following the protocol and interprets the URL in a special way: it takes everything after the hashbang and passes it to the website as a URL parameter named _escaped_fragment_. Google then rewrites the URL accordingly and requests the content from the static page.
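As a sketch of that rewrite, the small TypeScript function below shows how a hashbang URL maps to its escaped-fragment equivalent; the function name and example URL are illustrative only.

```ts
// Everything after "#!" becomes the value of the _escaped_fragment_ parameter.
function toEscapedFragmentUrl(hashbangUrl: string): string {
  const [base, fragment = ""] = hashbangUrl.split("#!");
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}

// The crawler requests this "ugly" URL, which your server answers with static HTML.
console.log(toEscapedFragmentUrl("https://example.com/#!/products/42"));
// -> https://example.com/?_escaped_fragment_=%2Fproducts%2F42
```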

There are two requirements for this solution to work. The first is that the site opts in to the AJAX crawling scheme so the crawler knows to request the 'ugly' escaped-fragment URLs; for pages that do not use a hashbang, this is achieved by adding a trigger (a fragment meta tag) to the page head, which tells the crawler to request the page with an empty _escaped_fragment_ parameter appended. The second is that the server responds to those escaped-fragment requests with a static HTML snapshot of the page.
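Here is a minimal sketch of how a server might satisfy both requirements, assuming an Express app and a hypothetical renderStaticSnapshot helper. The fragment meta tag in the head is the opt-in trigger described above.

```ts
import express from "express";

const app = express();

app.get("/products", async (req, res) => {
  if ("_escaped_fragment_" in req.query) {
    // Crawler request: serve pre-rendered static HTML for this page.
    res.send(await renderStaticSnapshot("/products"));
    return;
  }
  // Regular visitors get the AJAX shell, which carries the opt-in trigger.
  res.send(`<!DOCTYPE html>
<html>
  <head>
    <meta name="fragment" content="!">
    <script src="/app.js"></script>
  </head>
  <body><div id="root"></div></body>
</html>`);
});

// Hypothetical placeholder: any headless-browser or build-time snapshot would do here.
async function renderStaticSnapshot(path: string): Promise<string> {
  return `<html><body><h1>Products</h1><!-- static HTML for ${path} --></body></html>`;
}

app.listen(3000);
```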

To summarize, getting your AJAX website well positioned in Google starts with telling Google that your pages are rendered by JavaScript; Google then requests your URLs with the _escaped_fragment_ parameter, and you serve static HTML to the crawlers in response.

It might seem complicated, but there are tools that make the process simple, and Google itself provides guides and tooling to help you along the way.