- Natural Search Blog - http://www.naturalsearchblog.com -

Google’s Advice For Web 2.0 & AJAX Development

Posted By Chris On 11/7/2007 @ 3:47 pm In Best Practices,Google,Search Engine Optimization,SEO | 3 Comments

Yesterday, Google’s Webmaster Blog gave some great advice for Web 2.0 application developers in their post titled “A Spider’s View of Web 2.0 [1]”.

In that post, they recommend providing alternative navigation options on Ajaxified sites, both so that the Googlebot spider can index your site’s pages and so that users who have certain dynamic functions disabled in their browsers can still get around. They also recommend designing sites with “Progressive Enhancement”: build the site iteratively, beginning with the basics. Start out with simple HTML link navigation, then add JavaScript/Java/Flash/AJAX structures on top of that simple HTML foundation.
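As a minimal sketch of that layering (the URLs and the loadSectionViaAjax() function below are hypothetical, purely to illustrate the idea):

```html
<!-- Layer 1: plain HTML navigation that works for bots and script-less browsers -->
<ul id="nav">
  <li><a href="/products.html">Products</a></li>
  <li><a href="/about.html">About</a></li>
</ul>

<script type="text/javascript">
// Layer 2: if JavaScript is available, upgrade the links to load
// content dynamically; otherwise the plain href still works.
window.onload = function () {
  var links = document.getElementById('nav').getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    links[i].onclick = function () {
      loadSectionViaAjax(this.href); // hypothetical AJAX loader
      return false;                  // cancel the normal page load
    };
  }
};
</script>
```

Spiders and JS-disabled browsers follow the plain hrefs; everyone else gets the dynamic behavior.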

Before the Google Webmaster team posted those recommendations, I’d published a little article earlier this week on Search Engine Land about how Web 2.0 and Map Mashup Developers neglect SEO basics [2]. A month back, my colleague Stephan Spencer also wrote about how Web 2.0 is often search-engine-unfriendly [3] and how using Progressive Enhancement can help make Web 2.0 content findable in search engines like Google, Yahoo!, and Microsoft Live Search.

Even earlier, our colleague P.J. Fusco wrote an article for ClickZ back in May on How Web 2.0 Affects SEO Strategy [4].

We’re not just recycling each other’s work in all this; we’re each independently convinced that problematic Web 2.0 site design can limit a site’s traffic. If your pages don’t get indexed by the search engines, there’s a far lower chance of users finding your site. With just a modest amount of additional care and work, Web 2.0 developers can optimize their applications, and the benefits are clear. Wouldn’t you like to make a little extra money every month on ad revenue? Better yet, how about an investment firm, or a Google or Yahoo, offering you millions for your cool mashup concept?!?

But don’t just take it from the experts at Netconcepts: Google is confirming what we’ve been preaching for some time now.



3 Comments To "Google’s Advice For Web 2.0 & AJAX Development"

#1 Comment By Andy On 11/9/2007 @ 10:23 am

This is a good article and I agree, but it doesn’t say how a Web 2.0 site fails to get indexed. What are these sites doing differently that gets them skipped by the indexer? Missing META tags?

#2 Comment By Chris Silver Smith On 11/9/2007 @ 4:02 pm

Many of these sites have no crawlable links leading down into each unique piece of their content, and they frequently deliver that content only through JavaScript/Java/Flash.

Search engines still primarily need text, and they need sites constructed so that they can crawl through regular hyperlinks to reach it.
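A quick sketch of the difference (the showItem() handler and URLs here are hypothetical):

```html
<!-- Uncrawlable: no real destination, so a spider finds nothing to follow -->
<a href="#" onclick="showItem(42); return false;">Widget 42</a>

<!-- Crawlable: a real URL a spider can follow, with the script
     layered on top for users who have JavaScript enabled -->
<a href="/items/42.html" onclick="showItem(42); return false;">Widget 42</a>
```

Both links behave identically for a user with JavaScript; only the second gives the spider a path to the content.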

#3 Comment By Nathan Ziarek On 2/27/2008 @ 8:48 am

It is just too easy to jump right to the mashup stage with today’s tools. In my mind, that is the problem.

I am not trying to claim perfection, but my process is always:

1. Take the design and carve out the main elements (e.g., we’ll need lists, vertical navigation, headings, subheadings, and paragraphs)

2. Design the page without CSS or JS, using just good semantic markup (H1, et al.). At this point you’ve catered to the lowest common denominator (the bot). The important part here is links: each link should point to something, not merely activate a script. You can use JavaScript later to change a link’s behavior, but for the bot (or the person without JS) you’ve now made sure your content is 100% available.

3. Start styling the page with CSS. You’ll probably need to add SPANs and DIVs to realize the design, but that’s fine, since they don’t materially affect the look of the unstyled page. Don’t hide objects that you’ll want a bit of JavaScript to show later; if the user doesn’t have JS activated, they may never see them!

4. Use window.onload to load your JavaScript. Walk the DOM, adding “onclick” events and “return false;” to any links you want to activate a script (instead of linking to another page). Hide/animate/etc.

5. Profit (?)

It is definitely not as easy (or fun) as just tossing together a bunch of scripts and creating something neat, but it does guarantee that as you peel back each layer of web tech (behavior, style), you still have a usable site.
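A sketch of steps 2 through 4 above (element ids and URLs are placeholders): the collapsible panel is hidden by the script at load time, not by the stylesheet, so a visitor without JS still sees both the content and a working link.

```html
<!-- Step 2: semantic markup with a link that points somewhere real -->
<a href="/details.html" id="toggle">Show details</a>
<div id="details">
  <p>Extra details that remain visible without JavaScript.</p>
</div>

<script type="text/javascript">
// Step 4: enhance on load -- hide the panel from script (not CSS)
// and turn the link into a toggle with "return false;".
window.onload = function () {
  var panel  = document.getElementById('details');
  var toggle = document.getElementById('toggle');
  panel.style.display = 'none';
  toggle.onclick = function () {
    panel.style.display = (panel.style.display === 'none') ? '' : 'none';
    return false; // suppress the normal navigation to /details.html
  };
};
</script>
```

A bot (or a JS-disabled browser) sees the panel content inline and can follow /details.html; everyone else gets the toggle behavior.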



URL to article: http://www.naturalsearchblog.com/archives/2007/11/07/googles-advice-for-web-20-ajax-development/

URLs in this post:

[1] A Spider’s View of Web 2.0: http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html

[2] Web 2.0 and Map Mashup Developers neglect SEO basics: http://searchengineland.com/071105-125557.php

[3] Web 2.0 is often search-engine-unfriendly: http://searchengineland.com/071018-074610.php

[4] How Web 2.0 Affects SEO Strategy: http://www.clickz.com/showPage.html?page=3625943