Natural Search Blog


Google’s Advice For Web 2.0 & AJAX Development

Yesterday, Google’s Webmaster Blog gave some great advice for Web 2.0 application developers in their post titled “A Spider’s View of Web 2.0”.

In that post, they recommend providing alternative navigation options on Ajaxified sites, both so that the Googlebot spider can index your site’s pages and so that users who have certain dynamic functions disabled in their browsers can still get around. They also recommend designing sites with “Progressive Enhancement”: building a site up iteratively by beginning with the basics, starting out with simple HTML link navigation and then adding JavaScript/Java/Flash/AJAX structures on top of that simple HTML foundation.
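
To give a rough sketch of what that layering can look like (the URLs, element IDs, and page content below are invented for illustration, not taken from Google’s post): the navigation is plain HTML links that a spider or a script-disabled browser can follow, and a small script upgrades those same links to AJAX behavior only when JavaScript is actually available.

    <!-- Plain HTML navigation: crawlable and usable with JavaScript turned off -->
    <ul id="nav">
      <li><a href="/products.html">Products</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>
    <div id="content">Initial page content goes here.</div>

    <script type="text/javascript">
    // Enhancement layer: runs only when JavaScript is enabled, so the
    // plain links above keep working for spiders and script-less visitors.
    function loadSection(url) {
      var xhr = new XMLHttpRequest();
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('content').innerHTML = xhr.responseText;
        }
      };
      xhr.open('GET', url, true);
      xhr.send(null);
    }

    window.onload = function () {
      var links = document.getElementById('nav').getElementsByTagName('a');
      for (var i = 0; i < links.length; i++) {
        links[i].onclick = function () {
          loadSection(this.href);  // fetch the same crawlable URL via AJAX
          return false;            // stop the normal full-page navigation
        };
      }
    };
    </script>

Because every link still has a real destination, the spider and the script-disabled visitor get ordinary pages, while everyone else gets the in-page AJAX experience.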

Before the Google Webmaster team had posted those recommendations, I’d published a little article earlier this week on Search Engine Land on the subject of how Web 2.0 and Map Mashup Developers neglect SEO basics. A month back, my colleague Stephan Spencer also wrote an article on how Web 2.0 is often search-engine-unfriendly and how using Progressive Enhancement can help make Web 2.0 content findable in search engines like Google, Yahoo!, and Microsoft Live Search.

Even earlier than either of us, our colleague P.J. Fusco wrote an article for ClickZ back in May on How Web 2.0 Affects SEO Strategy.

We’re not just recycling each other’s work in all this; we’re each independently convinced that problematic Web 2.0 site design can seriously limit a site’s traffic. If your pages don’t get indexed by the search engines, there’s a far lower chance of users finding your site. With just a modest amount of additional care and work, Web 2.0 developers can optimize their applications, and the benefits are clear. Wouldn’t you like to make a little extra money every month on ad revenue? Better yet, how about if an investment firm, or a Google or a Yahoo, were to offer you millions for your cool mashup concept?

But don’t just take it from the experts at Netconcepts: Google is confirming what we’ve been preaching for some time now.

3 comments on “Google’s Advice For Web 2.0 & AJAX Development”

    This is a good article and I agree, but it doesn’t say how a Web 2.0 site ends up not getting indexed. What are these sites doing differently that gets them skipped by the indexing? Are they lacking META tags?

    Comment by Andy — 11/9/2007 @ 10:23 am


    Many of these sites do not have crawlable links going down into each unique piece of their content, and they frequently deliver that content only through JavaScript/Java/Flash.

    Search engines still primarily need text, and they need sites constructed in such a way that they can reach that text by crawling through regular hyperlinks.
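
    To make that concrete (a hypothetical snippet; showStore() stands in for whatever script actually pulls in the content), the first link below gives a spider nothing to follow, while the second points at a real URL that both the crawler and a visitor without JavaScript can use, and the script can still intercept the click for everyone else:

        <!-- Not crawlable: the href leads nowhere, and the content only arrives via script -->
        <a href="#" onclick="showStore(42); return false;">Store #42</a>

        <!-- Crawlable: a real URL the spider can follow; the script still runs for JS users -->
        <a href="/stores/42.html" onclick="showStore(42); return false;">Store #42</a>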

    Comment by Chris Silver Smith — 11/9/2007 @ 4:02 pm


    It is just too easy to jump right to the mashup stage with today’s tools. In my mind, that is the problem.

    I am not trying to claim perfection, but my process is always:

    1. Take the design and carve out the main elements (e.g., we’ll need lists, vertical navigation, headings, subheadings, and paragraphs)

    2. Design the page without CSS or JS, using just good semantic markup (H1, et al.). At this point you’ve catered to the lowest common denominator (the bot). The important part here is links: each link should point to something, not merely activate a script. You can use JavaScript later to change the link’s behavior, but for the bot (or the person without JS) you’ve now made sure your content is 100% available.

    3. Start styling the page with CSS. You’ll probably need to add SPANs and DIVs to realize the design, but that’s fine, as they don’t materially affect the look of the unstyled page. Don’t hide elements that you’ll want a bit of JavaScript to show later; if the user doesn’t have JS activated, they may never see them!

    4. Use window.onload to load your JavaScript. Walk the DOM, add “onclick” events and “return false;” to any links you want to activate a script (instead of linking to another page). Hide/animate/etc. (There’s a quick sketch of this step after the list.)

    5. Profit (?)
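
    A bare-bones sketch of steps 3 and 4 together (the markup, element IDs, and URL here are invented for the example): the extra panel stays visible in the plain HTML and CSS, and the onload script does both the hiding and the click wiring, so bots and visitors without JavaScript still get everything:

        <div id="details">Full product details live here, visible by default.</div>
        <a id="details-link" href="/product-details.html">Product details</a>

        <script type="text/javascript">
        window.onload = function () {
          var panel = document.getElementById('details');
          var link = document.getElementById('details-link');
          panel.style.display = 'none';      // hide only once script is running
          link.onclick = function () {
            panel.style.display = 'block';   // reveal the panel on click
            return false;                    // suppress the normal page load
          };
        };
        </script>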

    It is definitely not as easy (or fun) as just tossing together a bunch of scripts and creating something neat, but it does guarantee that as you peel back each layer of web tech (behavior, style) you still have a usable site.

    Comment by Nathan Ziarek — 2/27/2008 @ 8:48 am

