Should we drop the AJAX crawling scheme?

So Google has now deprecated the AJAX crawling scheme. They say not to bother implementing it on new websites, because it’s no longer needed now that Googlebot has no problem rendering dynamic content. Should we trust this statement immediately, or is it better to adhere to the deprecated standard for a while?

Solutions:

There are two solutions to this problem below; we recommend the first, as it is the more thorough, tried-and-true approach.

Solution 1

Several other search engines (Bing, Yandex, etc.) still use the _escaped_fragment_ system. They’re not going to stop using it overnight just because Google has. Thus, if you care about your site being indexable by search engines other than Google, you may want to still support this scheme.

Certainly, if you have already set up support for _escaped_fragment_ on your site, there’s no reason to disable it. If you’re developing a new site, you’ll need to weigh the cost of adding this feature against the benefits (keeping in mind that Google currently has a near-monopoly on Internet search, and that in any case, other search engines will likely soon scramble to follow Google’s example and implement better crawling of dynamic Ajax-loaded content).
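For illustration, here is a minimal sketch of what server-side support for the scheme can look like, assuming an Express app; the renderSnapshot helper is hypothetical and stands in for whatever pre-rendering your site uses (a headless browser, cached markup, etc.):

```typescript
// Sketch only: assumes an Express app; renderSnapshot() is a hypothetical
// stand-in for whatever pre-rendering mechanism your site uses.
import express from "express";

const app = express();

app.get("*", (req, res, next) => {
  // Crawlers that support the scheme rewrite  /#!/page  to
  // /?_escaped_fragment_=/page  (pages without #! URLs can opt in
  // with <meta name="fragment" content="!">).
  const fragment = req.query["_escaped_fragment_"];
  if (typeof fragment === "string") {
    // Serve a static HTML snapshot of the state behind the fragment.
    res.send(renderSnapshot(fragment));
    return;
  }
  next(); // ordinary request: fall through to the normal JS-driven page
});

// Hypothetical stub: a real site would return cached, pre-rendered HTML
// (e.g. produced by a headless browser) for the requested fragment.
function renderSnapshot(fragment: string): string {
  return `<html><body><h1>Snapshot for ${fragment}</h1></body></html>`;
}

app.listen(3000);
```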


Finally, note that in most cases, the simplest and most foolproof solution is to implement your site so that it doesn’t need such tricks in the first place. At least 99% of the time, you don’t really need any Ajax, or even client-side scripting at all. By avoiding unnecessary reliance on Ajax, and by designing your site so that at least basic browsing features work even with JavaScript disabled, you’ll ensure the widest possible compatibility across browsers and search engines.

The trick to doing this efficiently is to first set up the basic functionality of your site using basic HTML and CSS and plain old links, with no JS at all. Once you’ve done that, you can add JS and Ajax on top of that for smoother loading and extra features, while still retaining a graceful fallback interface for users and search engines who don’t support the extra features. If you start out relying on Ajax for everything, however, retrofitting a non-Ajax fallback interface later can be very difficult and awkward.
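As a rough sketch of that layering, the script below (assuming a page whose main content lives in an element with the illustrative id "content") upgrades ordinary links to Ajax loads while leaving non-JavaScript navigation fully functional:

```typescript
// Sketch only: the page works as plain links if this script never runs;
// the "content" element id is an illustrative assumption.
document.addEventListener("click", async (event) => {
  const target = event.target as Element | null;
  const link = target?.closest("a");
  if (!link || link.origin !== location.origin) return; // leave other links alone

  event.preventDefault();
  try {
    const response = await fetch(link.href);
    const doc = new DOMParser().parseFromString(await response.text(), "text/html");
    const next = doc.querySelector("#content");
    const current = document.querySelector("#content");
    if (next && current) {
      current.replaceWith(next);              // swap in just the content region
      history.pushState(null, "", link.href); // keep the URL shareable and crawlable
    } else {
      location.href = link.href;              // unexpected markup: full navigation
    }
  } catch {
    location.href = link.href;                // network error: full navigation
  }
});
```

Because the underlying URLs never change shape, crawlers and users without JavaScript still see ordinary, followable links.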

Solution 2

Google already crawls and processes JavaScript, so there is no need to implement the AJAX crawling scheme in new sites.


All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
