Natural Search Blog


Inbound Deep Links Benefit Page Rank Distribution Sitewide

You have probably come across sites (especially large ones) where, the deeper you dig into the site hierarchy, the more often the PageRank toolbar is grayed out or shows a value of 0. In general, the home page is the starting point for a website, and it accrues the most PageRank.

The entire domain’s authority and trust are reflected in this PageRank value. The home page then distributes this PageRank, often referred to as link juice, to the first-level category pages, the second-level sub-category pages, and the third-level product pages. In general, the first-level pages derive the most link juice from the home page. But on a site with a very large number of sub-categories and product pages (the money pages), the PageRank distribution is uneven: some pages gain link juice while the large majority gain almost none.
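To make the dilution concrete, here is a minimal back-of-the-envelope sketch in Python. It is my own simplified model with made-up numbers, not how Google actually computes PageRank, but it shows how each level's share of the home page's rank shrinks:

# Simplified illustration of how "link juice" thins out as it flows down a
# site hierarchy. Numbers are invented; real PageRank is an iterative
# calculation over the entire link graph with a damping factor.

DAMPING = 0.85          # classic PageRank damping factor
HOME_PAGE_RANK = 10.0   # hypothetical rank accumulated by the home page

def rank_per_page(parent_rank: float, children: int) -> float:
    """Share of the parent's rank passed along to each child page."""
    return DAMPING * parent_rank / children

category_rank = rank_per_page(HOME_PAGE_RANK, children=10)      # 10 categories
subcategory_rank = rank_per_page(category_rank, children=20)    # 20 sub-cats each
product_rank = rank_per_page(subcategory_rank, children=50)     # 50 products each

print(f"category:    {category_rank:.4f}")     # ~0.85
print(f"subcategory: {subcategory_rank:.4f}")  # ~0.036
print(f"product:     {product_rank:.6f}")      # ~0.0006

Deep inbound links from affiliates, partners, or press inject rank directly at the product level instead of relying on this trickle-down, which is why they benefit the distribution sitewide.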

(more…)

Amazon’s Secret to Dominating SERP Results

Many e-tailers have looked with envy at Amazon.com’s sheer omnipresence within the search results on Google. Search for any product ranging from new book titles, to new music releases, to home improvement products, to even products from their new grocery line, and you’ll find Amazon links garnering page 1 or 2 rankings on Google and other engines. Why does it seem like such an unfair advantage?

Can you keep a secret? There is an unfair advantage. Amazon is applying conditional 301 URL redirects through their massive affiliate marketing program.

Most online merchants outsource the management and administration of their affiliate program to a provider who tracks all affiliate activity, using special tracking URLs. These URLs typically break the link association between affiliate and merchant site pages. As a result, most natural search traffic comes from brand related keywords, as opposed to long tail keywords. Most merchants can only imagine the sudden natural search boost they’d get from their tens of thousands of existing affiliate sites deeply linking to their website pages with great anchor text. But not Amazon!

Amazon’s affiliate (“associate”) program is fully integrated into the website. So the URL that you get by clicking from Guy Kawasaki’s blog, for example, to buy one of his favorite books from Amazon doesn’t route you through a third-party tracking URL, as would be the case with most merchant affiliate programs. Instead, you’ll find it links to an Amazon.com URL (to be precise: http://www.amazon.com/exec/obidos/ASIN/0060521996/guykawasakico-20), with the notable associate’s name at the end of the URL so Guy can earn his commission.

However, refresh that page with your browser’s user agent switched to Googlebot, and you’ll see what Googlebot (and other crawlers) get when they request that same URL: http://www.amazon.com/Innovators-Dilemma-Revolutionary-Business-Essentials/dp/0060521996, delivered via a 301 redirect. That’s the same URL that shows up in Google when you search for this book title.

So if you are a human coming in from affiliate land, you get one URL used to track your referrer’s commission. If you are a bot visiting this URL, you are told it now redirects permanently to the keyword URL. In this way, Amazon is able to have its cake and eat it too: provide an owned-and-operated affiliate management system while harvesting the PageRank from millions of deep affiliate backlinks to maximize its ranking visibility for long tail search queries.
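For illustration only, here is a minimal sketch of the general technique described above, a user-agent-conditional 301 redirect, written as a small Flask handler in Python. The bot list, URL pattern, and product mapping are assumptions invented for the example; this is not Amazon’s actual code.

# Sketch of a user-agent-conditional 301 redirect (illustrative only).
# Affiliate-style tracking URLs are served to human visitors, while known
# crawlers are 301-redirected to the canonical keyword-rich URL so that
# link equity consolidates there.
from flask import Flask, redirect, request

app = Flask(__name__)

KNOWN_BOTS = ("googlebot", "bingbot", "slurp")  # assumed list for the example

# Hypothetical mapping from a product ID to its keyword-rich canonical path.
CANONICAL_PATHS = {
    "0060521996": "/Innovators-Dilemma-Revolutionary-Business-Essentials/dp/0060521996",
}

@app.route("/exec/obidos/ASIN/<asin>/<associate_id>")
def affiliate_link(asin: str, associate_id: str):
    user_agent = request.headers.get("User-Agent", "").lower()
    canonical = CANONICAL_PATHS.get(asin)
    if canonical and any(bot in user_agent for bot in KNOWN_BOTS):
        # Crawlers get a permanent redirect to the canonical keyword URL.
        return redirect(canonical, code=301)
    # Human visitors stay on the tracking URL so the associate gets credit.
    return f"Product page for {asin} (credited to {associate_id})"

if __name__ == "__main__":
    app.run()

A production system would more likely key off verified crawler identification rather than a simple substring match on the user agent, but the conditional-redirect principle is the same.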

(Note I’ve abstained from hyperlinking these URLs so bots crawling this content do not further entrench Amazon’s ranking on these URLs, although they are already #4 in the query above!).

So is this strategy ethical? Conditional redirects are a no-no because they send mixed signals to the engine: is the URL permanently moved or not? If it is, but only for bots, then you are crossing the SEO line. But in Amazon’s case it appears that searchers, as well as general site users, also get the keyword URL, so it is merely the affiliate visitors who get an “old” URL. If that’s the case across the board, it would be difficult to argue that Amazon is abusing this concept; rather, it has cleverly engineered a solution to a visibility problem that other merchants would replicate if they could. In fact, from a searcher’s perspective, were it not for Amazon, many long tail product queries consumers conduct would return zero recognizable retail brands to buy from, with all due respect to PriceGrabber, DealTime, BizRate, NexTag, and eBay.

As a result of this long tail strategy, I’d speculate that Amazon’s natural search keyword traffic distribution looks more like 40/60 brand to non-brand, rather than the typical 80/20 or 90/10 distribution curve most merchants (who lack affiliate search benefits) receive.

Brian


GravityStream Does Local SEO: Now Fixes Store Locator Pages

I’m pleased to announce that GravityStream can now optimize store locator pages for those retailer sites which provide search utilities for their local outlets.

GravityStream Compass Rose

As you may recall, I’ve written before about how dealer locators are terribly optimized and how store locator pages can be optimized. A great many store locator sections on major corporate sites do not allow search engine spiders to properly crawl through and index all the locations where those companies have brick-and-mortar outlets.

Most large companies seem fairly unaware that their store locators are effectively blocking search engine spiders and are making it impossible for end users to find their locations through simple keyword searches. I’ve also listed a number of top store locator providers that produce location services like this for many Internet Retailer 500 companies.

Read on for details on our results…

(more…)

Advice on Subdomains vs. Subdirectories for SEO

Matt Cutts recently revealed that Google is now treating subdomains much more like subdirectories of a domain, in the sense that Google wishes to limit how many results from a single site show up for a given keyword search. In the past, some search marketers attempted to use keyworded subdomains as a method for improving search referral traffic, deploying many keyword subdomains for the terms they hoped to rank well for.

Not long ago, I wrote an article on how some local directory sites were using subdomains in an attempt to achieve good ranking results in search engines. In that article, I concluded that most of these sites were ranking well for other reasons not directly related to the presence of the keyword as a subdomain — I showed some examples of sites which ranked equally well or better in many cases where the keyword was a part of the URI as opposed to the subdomain. So, in Google, subdirectories were already functioning just as well as subdomains for the purposes of keyword rank optimization. (more…)

Dealer Locator & Store Locator Services Need to Optimize

Store Locators

My article on local SEO for store locators was just published on Search Engine Land, and any company that has a store locator utility ought to read it. Many large companies provide a way for users to find their local stores, dealers, or authorized resellers. The problem is that these sections are usually hidden from the search engines behind search submission forms, JavaScript links, HTML frames, and Flash interfaces.

For many national or regional chain stores, providing dealer-locator services with robust maps, driving directions and proximity search capability is outside of their core competencies, and they frequently choose to outsource that development work or purchase software to enable the service easily.
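As a rough sketch of the fix these posts advocate, the idea is to expose every location as a plain, crawlable HTML link rather than hiding it behind a search form, script, or Flash widget. The data and URL scheme below are invented purely for illustration:

# Sketch of generating a crawlable, static index of store locations
# (invented data and URL scheme). The point is that each location gets a
# plain HTML link a spider can follow, instead of being reachable only
# through a ZIP-code search form or JavaScript map widget.
from html import escape

locations = [
    {"id": "chicago-il-001", "name": "Chicago Loop Store", "city": "Chicago", "state": "IL"},
    {"id": "madison-wi-002", "name": "Madison West Store", "city": "Madison", "state": "WI"},
]

def location_url(loc: dict) -> str:
    # Keyword-bearing, parameter-free URL for each location page.
    return f"/stores/{loc['state'].lower()}/{loc['id']}.html"

def render_index(locs: list) -> str:
    links = "\n".join(
        f'  <li><a href="{location_url(l)}">{escape(l["name"])} - '
        f'{escape(l["city"])}, {escape(l["state"])}</a></li>'
        for l in locs
    )
    return f"<ul>\n{links}\n</ul>"

print(render_index(locations))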

I did a quick survey and found a number of companies providing dealer locator or store finder functionality: (more…)

Double Your Trouble: Google Highlights Duplication Issues

Maile Ohye posted a great piece on Google Webmaster Central about the effects of duplicate content caused by common URL parameters. There is great information in that post, not least of which is that it validates exactly what a few of us have been saying for a while: duplication should be addressed because it can water down your PageRank.

Double Trouble: Duplicate Content Problems

Maile suggests a few ways of addressing dupe content, and she also reveals a few details of Google’s workings that are interesting, including: (more…)
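By way of illustration of one such mitigation (my own sketch, not necessarily one of the approaches from Maile’s post), a common tactic is to normalize URLs by stripping parameters that don’t change the page content, so link equity consolidates on a single URL per page. The list of ignorable parameters here is an assumption for the sake of the example:

# Sketch of URL normalization to collapse parameter-driven duplicates.
# The set of "ignorable" parameters is an assumption for this example.
from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign", "ref"}

def canonicalize(url: str) -> str:
    """Drop parameters that don't change page content, and sort the rest."""
    parts = urlparse(url)
    kept = sorted(
        (k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
        if k.lower() not in TRACKING_PARAMS
    )
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both variants collapse to the same canonical URL:
print(canonicalize("http://example.com/widget?id=42&sessionid=abc123"))
print(canonicalize("http://example.com/widget?utm_source=feed&id=42"))
# -> http://example.com/widget?id=42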

Automatic Search Engine Optimization through GravityStream

I’ve had a lot of questions about my new work since I joined Netconcepts a little over three months ago as their Lead Strategist for their GravityStream product/service. My primary role is to bring SEO guidance to clients using GravityStream, and to provide thought leadership to the ongoing development of the product and business.

GravityStream

GravityStream is a technical solution that provides outsourced search optimization to large, dynamic websites. Automatic SEO, if you will. Here’s what it does…

(more…)

Subdomains for Local Directory Sites?

Earlier this week, my column on “Domaining & Subdomaining in the Local Space – Part 1” went live at Search Engine Land. In it, I examine how a number of local business directory sites are using subdomains with the apparent desire to get extra keyword ranking value from them. Typically, they place the names of cities in the third-level domain names (aka “subdomains”). Some sites doing that include:

In that installment, I conclude that the subdomaining for the sake of keyword ranking has no real benefit.

This assertion can really be extended to all other types of sites as well, since the ranking criteria that the search engines use are not limited to local info sites. Keywords in subdomains really have no major benefit.

SEO firms used to suggest that people deploy their content onto “microsites” for all their keywords, with a different domain name targeting each one. This just isn’t a good strategy, really. Focus on improving the quality of the content for each keyword, each on its own page, and work on your link-building efforts (quality link building, not masses of unqualified, bad-quality links). Tons of keyword domains or subdomains are no quick solution for ranking well.


Podcasts of Neil Patel, Eric Ward, and Vanessa Fox

I’ve been interviewing speakers at the AMA’s Hot Topic: Search Engine Marketing events taking place April 20th in San Francisco, May 25th in NYC, and June 22nd in Chicago (all three of which I will be chairing). I had fascinating and insightful conversations with link builder extraordinaire Eric Ward, Googler Vanessa Fox, and social media marketing guru Neil Patel. There’s some real gold in those interviews.

Download/Listen:

More podcasts to come from other speakers, so be sure to subscribe to the RSS feed so you don’t miss them. Also be sure to register for the conference in one of the three cities; it’ll be great!


Dupe Content Penalty a Myth, but Negative Effects Are Not

I was interested to read a column by Jill Whalen this past week on “The Duplicate Content Penalty Myth” at Search Engine Land. While I agree with her assessment that there really isn’t a Duplicate Content Penalty per se, I think she perhaps failed to address one major issue affecting websites in relation to this.

Read on to see what I mean.

(more…)
