“Googleable” Ajaxed Content - Varun Grover
Background: Most web content is dynamic (not just static HTML pages).
Search is the primary navigation tool.
Google Search is the de facto search tool; most people "google" the stuff they are looking for.
We are increasingly making use of Ajax to fetch and serve content.
The trouble with Ajax is that the URL doesn't change. Often, the app state is appended to the URL after a hash mark (#).
HTTP does NOT allow user agents (UAs) to send URL fragments in requests; the fragment never reaches the server.
However, we do want the Ajaxed content to be crawlable for improved search ranking.
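A quick illustration of why this matters (using the URL from the scheme's example below): the fragment is purely client-side state, so parsing the URL shows it, but the browser strips it before building the HTTP request.

```javascript
// Parse a hash-bang URL. The fragment lives only in the browser; the
// server receives just the path (and query string, if any).
const url = new URL('http://www.example.com/ajax.html#!key=value');

console.log(url.pathname + url.search); // what the server sees: "/ajax.html"
console.log(url.hash);                  // stays client-side: "#!key=value"
```

Because the server never sees `#!key=value`, it has no way to serve that state to a crawler through normal HTTP, which is the gap the scheme tries to close.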
Google's new initiative: a new standard for making AJAX-based websites crawlable.
It benefits businesses and users by making content from rich and interactive AJAX-based websites universally accessible through search results on "any search engine that chooses to take part".
Some of the target goals behind this initiative:
Minimal changes are required as the website grows
Users and search engines see the same content (no cloaking)
Search engines can send users directly to the AJAX URL (not to a static copy)
Site owners have a way of verifying that their AJAX website is rendered correctly and thus that the crawler has access to all the content
The fundamentals: progressive enhancement. If you already build sites this way, this is not a valid problem at all!
Hijax
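The Hijax pattern can be sketched as follows. This is a minimal, hypothetical example: the `data-hijax` attribute, the `/pages/...` URL convention, and the `hrefToFragment` helper are all assumptions for illustration, not part of the scheme itself.

```javascript
// Hijax: every link carries a real, crawlable href that works without
// JavaScript; script "hijacks" the click and loads the same content via
// Ajax, recording the state in the URL fragment.

// Map a plain href to the fragment state the Ajax app uses
// (assumed convention: '/pages/about.html' -> '#!/about').
function hrefToFragment(href) {
  const name = href.replace(/^\/pages\//, '').replace(/\.html$/, '');
  return '#!/' + name;
}

// Progressive enhancement: this only runs when JS is available; the raw
// href keeps working for crawlers and script-less users. Guarded so the
// helper above can also run outside a browser.
if (typeof document !== 'undefined') {
  document.addEventListener('click', (event) => {
    const link = event.target.closest('a[data-hijax]');
    if (!link) return;
    event.preventDefault();
    location.hash = hrefToFragment(link.getAttribute('href'));
    // ...then fetch and inject the content for that state via Ajax.
  });
}
```

The key design point is that the fragment state is derived from a URL that already works on its own, so no content exists only behind JavaScript.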
The implementation: indicate to the crawler that your site supports the AJAX crawling scheme by using hash-bang (#!) fragments:
www.example.com/ajax.html#!key=value
Set up your server to handle requests for URLs that contain _escaped_fragment_
www.example.com/ajax.html?_escaped_fragment_=key=value
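The mapping between the two URL forms above can be sketched as a small function. This is a simplified illustration of what the crawler does before making its request; it ignores the spec's percent-escaping rules for special characters in the fragment.

```javascript
// Rewrite a hash-bang URL into the _escaped_fragment_ form a crawler
// requests from the server (simplified: no percent-escaping of the
// fragment's special characters).
function toEscapedFragmentUrl(prettyUrl) {
  if (!prettyUrl.includes('#!')) return prettyUrl; // not an AJAX URL
  const [base, fragment = ''] = prettyUrl.split('#!');
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}

console.log(toEscapedFragmentUrl('http://www.example.com/ajax.html#!key=value'));
// -> http://www.example.com/ajax.html?_escaped_fragment_=key=value
```

Your server then answers that rewritten URL with a full HTML snapshot of the state the fragment describes.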
Handle pages without hash fragments by adding this meta tag:
<meta name="fragment" content="!">
Consider updating your Sitemap to list the new AJAX URLs
Optionally, test the crawlability of your app: see what the crawler sees with "Fetch as Googlebot"
The bad parts: arbitrary. Its approach is akin to using a sledgehammer to crack nuts (pun not intended).
It gives people one less reason to produce progressive, accessible, semantic websites.
In fact, it can push adoption of WAI-ARIA guidelines down the drain. Rather than forcing site owners to make their sites available to "all" users, it allows them to focus on just "one" user: the Googlebot.
It requires site owners to change their URI scheme (or, at least, their fragments).
No flexibility in the specs for using a different fragment prefix.
This is like saying, "we know you have no idea how to build rich user experiences and we don't expect you to change things, so we will do our best to crawl everything".