Natural Search Blog


Is Web 2.0 Bogus as a Business Model?

For all of us who survived the exuberance of the dot-bomb era, a little blog entry by my friend Randy Weber, entitled “Business Development 2.0 is BS 1.0”, could serve as a nice dose of logic amid all the current hype.

In his post, he lists some salient points on why some of the Web 2.0 companies’ business plans are woefully lacking, and it makes some sense. After all, if you’re sharing your core content and intellectual property (in the form of free apps), anyone can replicate your site, they might just do it better than you, and you won’t see any of the money they’re making!

Web 2.0 may not have a simple definition beyond “I know it when I experience it,” but at base it seems to be founded on user-generated content, syndication of that content, and application services. So businesses built on Web 2.0 would seem to run counter to the normal mechanics of classic business, and companies founded on it might carry a lot more risk.

I think there are just a few caveats to Randy’s post, though. Read on and I’ll explain.

(more…)

Tips for Local Search Engine Optimization for Your Site

Increasingly, businesses are becoming aware of Local Search and of how vital optimizing for this channel is for those with local outlets. Each of the main search engines has focused effort on its local search tools as the best strategy for continuing growth in online advertising, and the subject has become important enough to merit a special Search Engine Strategies conference devoted to it tomorrow in Denver. The importance of Local Search is further underscored by stats issued in a press release today by comScore, showing that Local Search continues to gain market share.

So, how exactly could one optimize towards Local Search?

Read on and I’ll outline a few key tips.
(more…)

Governor Rick Perry & H. Ross Perot open Nanotech Conference, nanoTX ’06

Yesterday morning, Governor Rick Perry and H. Ross Perot each gave opening remarks at the nanoTX nanotechnology conference here in Dallas.

Kelly Kordzik (president of the board of the Texas Nanotechnology Initiative) and Rich Templeton (CEO of Texas Instruments) each gave some introductory remarks. Templeton’s talk was interesting, touching on work done by TI and on how nanotech is still in its infancy and very much dependent on government investment and support. Governor Rick Perry spoke for a short while, mainly promoting his support of the Texas Emerging Technology Fund, which has enabled a lot of nanotech research and put Texas at the leading edge of the work worldwide.

Read on for more details…

(more…)

Using Flickr for Search Engine Optimization

I’ve previously blogged about optimization for Image Search, but images can also be used to optimize for regular web search. Where online promotion is concerned, this appears to be an area of advantage that remains largely untapped. Many pros focus most of their optimization efforts on the more popular web search results and don’t realize that optimizing for image search can translate into good overall SEO.

Flickr is one of the most popular image sharing sites in the world, with loads of features that also make it qualify as a social networking site. Flickr’s popularity, structure and features also make it an ideal vehicle for search engine optimization. So, how can image search optimization be done through Flickr? Read on, and I’ll outline some key steps to take. (more…)

Putting Keywords in Your URLs

Recently Matt Cutts blogged that:

doing the query [site:windowslivewriter.spaces.live.com] returns some urls like windowslivewriter.spaces.live.com /Blog/cns!D85741BB5E0BE8AA!174.entry . In general, urls like that sometimes look like session IDs to search engines. Most bloggy sites tend to have words from the title of a post in the url; having keywords from the post title in the url also can help search engines judge the quality of a page.

He then clarified his statement above, in the comments of that post:

Tim, including the keyword in the url just gives another chance for that keyword to match the user’s query in some way. That’s the way I’d put it.

What does this mean? It means that from Google’s perspective, keywords in your URLs are a useful thing to have. It’s another “signal” and can provide ranking benefits.

How should you separate these keywords? Not with underscores, that’s for sure. Matt Cutts has previously gone on the record to say that Google does not treat underscores as word separators. Use hyphens instead. Or plus signs would be okay too.

Also, I’d avoid using too many hyphens in the URL, as that can look spammy; try to keep it to three or fewer. The exception is if your site is powered by WordPress, in which case Google probably makes an allowance, given how popular the platform is and how many legitimate bloggers have loads of hyphens in their permalink URLs. By the way, you can trim those down using the Slug Trimmer plugin for WordPress.
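
To illustrate, here’s a minimal sketch of turning a post title into a hyphenated, keyword-bearing URL slug. The function name and the word cap are my own hypothetical choices for this example, not anything prescribed by Google or WordPress:

import re

def slugify(title, max_words=4):
    """Turn a post title into a hyphen-separated URL slug."""
    # Keep only letters, digits, and whitespace, then lowercase everything.
    cleaned = re.sub(r"[^A-Za-z0-9\s]", "", title)
    words = cleaned.lower().split()
    # Cap the word count so the URL stays at three hyphens or fewer.
    return "-".join(words[:max_words])

print(slugify("Putting Keywords in Your URLs"))   # putting-keywords-in-your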


To Use Sitemaps, or Not To Use Sitemaps, That’s the Question

It was a great thing when Google launched Sitemaps (recently renamed Webmaster Tools, as part of its Webmaster Central utilities). The launch signaled a new era in which technicians who wished to help make their pages findable would not automatically be considered “evil”, and the search engines might provide tools to help technicians disclose their pages directly. Yahoo! soon followed with its own tool, Yahoo! Site Explorer, and surely MSN will bow to peer pressure with its own submission system and tools.

Initially, I thought there wasn’t a significant advantage for me in using these systems, because I’d already developed good methods for providing our page links to the search engines through the natural linking found in our site navigation.

Why should I expend yet more time and resources to dynamically produce the link files?
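
For what it’s worth, producing the file itself isn’t much work. Here’s a minimal sketch, with a hypothetical URL list and output file name, of writing out a basic XML Sitemap in the standard sitemaps.org format the engines accept:

from xml.sax.saxutils import escape

# Hypothetical list of page URLs, e.g. pulled from a site's navigation or database.
page_urls = [
    "http://www.example.com/",
    "http://www.example.com/products/widgets.html",
    "http://www.example.com/blog/putting-keywords-in-your-urls/",
]

def build_sitemap(urls):
    """Return a basic XML Sitemap (sitemaps.org 0.9 schema) as a string."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(page_urls))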

(more…)

Robots Meta Tag Not Well Documented by Search Engines

Those of us who do SEO have been increasingly pleased with the various search engines for providing tools and protocols that help us direct, control, and manage how our sites are indexed. However, the search engines still need to keep much of their inner workings secret, for fear of being exploited by ruthless black-hats who seek to improve page rankings for keywords regardless of appropriateness. This often leaves the rest of us with tools that can be used in some limited cases, but with little or no documentation of how those tools actually operate in the complex real world. The Robots META tag is a case in point.

The idea behind the protocol is simple and convenient. It’s sometimes hard to use a robots.txt file to manage all the types of pages delivered up by large, dynamic sites, so what could be better than a tag placed directly on a page to tell the search engine whether or not to spider and index it? Here’s how the tag should look if you wanted a page NOT to be indexed and the links found on it NOT to be crawled:

<meta name="robots" content="noindex,nofollow">

Alternatively, here’s the tag if you wanted to expressly tell the bot to index the page and crawl the links on it:

<meta name="robots" content="index,follow">

But what if you wanted the page not to be indexed while its links were still spidered? Or what if you needed the page indexed, but its links not followed? The major search engines don’t clearly describe how they treat these combinations, and the effects may not be what you’d expect. Read on and I’ll explain how using this simple protocol with the odd combos had some undesirable effects.
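
To make the ambiguity concrete, here’s a minimal sketch of how a crawler might read these directives off a page. The interpretation at the end (index unless “noindex”, follow unless “nofollow”) is my own assumption about typical behavior, not documented behavior from any particular engine:

from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from any <meta name="robots"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives.update(d.strip().lower() for d in content.split(","))

page = '<html><head><meta name="ROBOTS" content="noindex,follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)

# Assumed interpretation: index unless "noindex" appears, follow unless "nofollow".
index_page = "noindex" not in parser.directives
follow_links = "nofollow" not in parser.directives
print(index_page, follow_links)   # False True for the odd combo above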

(more…)

Flickr Adds Geotagging Features

In a move that proves the people behind flickr are still channeling the Web 2.0 mass consciousness, flickr announced this week that they’re adding geotagging features to their already-robust suite of image management products.

As you may recall, I previously blogged a bit about the rise of geotagging, particularly geotagging of photos, and I had said that it seemed to be a really strong idea with a lot of potential uses. It’s gratifying to see that a service like flickr (and a company like Yahoo!) also believes that it will be strategically beneficial.

The number of people who have been geotagging, or who even know about it, is likely a relatively low percentage of the online populace, I’d guess (partly because most people don’t have a GPS device to tell them a location’s longitude and latitude). Now that a top-ranked photo site supports it expressly, droves of users will learn about it and experiment with it. By doing this, flickr is propelling the trend into the mainstream, increasing the likelihood that it’ll be more widely adopted.

Flickr’s new geotagging utilities were built by mashing up their image management tools with Yahoo! Maps, allowing users to drag pix onto the mapped location where an image was taken, in order to associate the photo with that geotag. It also appears that users can now use the map itself as a navigational interface to browse to a geographic location and pull up any publicly-available photos associated with it. Read on for more info.
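
Conceptually, the underlying data model is simple. Here’s a minimal, hypothetical sketch (the photo IDs, coordinates, and bounding box are made up for illustration; this isn’t Flickr’s actual API) of associating photos with latitude/longitude pairs and pulling back the ones that fall inside a map’s viewport:

# Hypothetical in-memory geotag store: photo ID -> (latitude, longitude).
geotags = {
    "photo_1001": (32.7767, -96.7970),   # Dallas
    "photo_1002": (37.7749, -122.4194),  # San Francisco
}

def tag_photo(photo_id, lat, lon):
    """Associate a photo with the spot on the map it was dragged to."""
    geotags[photo_id] = (lat, lon)

def photos_in_view(south, west, north, east):
    """Return photo IDs whose geotags fall inside the map's bounding box."""
    return [pid for pid, (lat, lon) in geotags.items()
            if south <= lat <= north and west <= lon <= east]

tag_photo("photo_1003", 30.2672, -97.7431)  # Austin
print(photos_in_view(25.0, -107.0, 37.0, -93.0))  # the two Texas photos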

(more…)
