
Making your AJAX Content SEO Friendly
August 25, 2016
This is an experiment to see whether search engines can crawl AJAX sites, and to find better ways of keeping them SEO friendly.
AJAX, or Asynchronous JavaScript and XML, is a technique born of the need for a better user experience. We used to browse a site link after link, and sometimes so little changed between pages that it became tedious to go through the motions of waiting for each page and all its images to render.
In comes AJAX. The technique works by requesting the contents of a URL through an API call (XMLHttpRequest) in JavaScript. When that URL returns its contents, the browser does not reload the whole page; instead, the calling function decides what to do with the response. AJAX can load a snippet into a block on the page or fill out a table with a schedule of data returned from an API call. It could be many things, but we can agree that AJAX made interacting with the web more "user-friendly".
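A minimal sketch of that flow might look like the following. The URL and element id are illustrative, not from any real site:

```javascript
// Request the contents of a URL via XMLHttpRequest; the callback
// (the "calling function") decides what to do with the response.
function loadInto(url, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  xhr.onreadystatechange = function () {
    // readyState 4 = done; status 200 = success
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText);
    }
  };
  xhr.send();
}

// Usage (in a browser): pull a snippet into a block on the page
// without reloading anything. Hypothetical path and element id:
//
//   loadInto("/snippets/article.html", function (html) {
//     document.getElementById("article").innerHTML = html;
//   });
```

The page stays put; only the targeted block changes, which is exactly what makes the content invisible to a crawler that does not execute JavaScript.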
For websites, search engine optimization (SEO) is king. Without it, your website would just be a folder of files and images that no one can find. Search engines make our sites visible, and a site's ranking is decided by certain standards that must be met. In the beginning, you just needed the right keywords and description to rank. Now search engines look at much more, and content is one of the biggest factors.
So why bring up SEO? Because getting your content after the page has loaded is an SEO problem. For the longest time, search engines like Google did not render JavaScript-enabled sites that well. So in 2009, Google proposed a scheme to help AJAX-heavy sites get their content crawled. Web developers around the world were ecstatic! It brought on a slew of blog posts leveraging this newfound power over search engines, but it didn't last. In 2015, Google deprecated the crawling scheme and announced that its crawlers can read AJAX and JavaScript-heavy sites much as modern browsers do. So there's practically no need to go to the effort of serving an escaped fragment for your AJAX content.
To see whether this is true, I've devised a simple experiment. Below are a few sections whose content is loaded from AJAX calls. The objective is to publish this page and see how well, if at all, Google indexes the content.
{% raw %}
CodeSpud AJAX Experiments
This section is loaded via AJAX. The first section is an HTML snippet, where I pull an article block into the page. The second section is markup generated from extracted JSON data.
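The second section can be sketched roughly like this: a helper turns fetched JSON into table markup, which would then be injected into the page. The field names (`time`, `title`) are made up for illustration:

```javascript
// Build table rows from a JSON string (e.g. a schedule returned by an
// API call). Since this markup only exists after the script runs, a
// crawler must execute JavaScript to ever see it.
function rowsFromJSON(json) {
  var items = JSON.parse(json);
  return items.map(function (item) {
    return "<tr><td>" + item.time + "</td><td>" + item.title + "</td></tr>";
  }).join("");
}
```

In the browser, an AJAX call would fetch the JSON and the result would land in a `<tbody>` via `innerHTML`, e.g. `tbody.innerHTML = rowsFromJSON(xhr.responseText);`.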
I will follow up with the results of my experiment in another post. If you have any ideas on how I can improve this experiment or additional items I need to consider, please post your comments below.
Special thanks to Genesis for the amazing AJAX logo I used for this post.