Natural Search Blog


Advice on Subdomains vs. Subdirectories for SEO

Matt Cutts recently revealed that Google is now treating subdomains much more like subdirectories of a domain, in the sense that they wish to limit how many results from a single site show up for a given keyword search. In the past, some search marketers attempted to use keyworded subdomains as a method for improving search referral traffic from search engines, deploying many keyword subdomains for terms they hoped to rank well for.

Not long ago, I wrote an article on how some local directory sites were using subdomains in an attempt to achieve good rankings in search engines. In that article, I concluded that most of these sites were ranking well for other reasons not directly related to the presence of the keyword in a subdomain; I showed examples of sites that, in many cases, ranked equally well or better when the keyword was part of the URI path rather than the subdomain. So, in Google, subdirectories were already functioning just as well as subdomains for the purposes of keyword rank optimization. (more…)

Google Hiding Content Behind an Image on their SERPs

Tamar Weinberg at Search Engine Roundtable reports that in a Google Groups forum, a Webmaster Central team member stated that you could use something like the z-index attribute in DHTML styles to hide text or links behind an image, so long as the text/link being hidden is what’s represented in the image.

I think it’s a good thing that they do allow this sort of use, because it appears to me that they’re doing this very thing on their own search results pages! If you refresh a search page, you can see what they’re hiding under their own logo:

Google hides a text link behind its logo

…a text link pointing to their homepage.

Now, the interesting question I'd have for the Google team is this: the rule is straightforward if the image itself contains text, but what would be allowable if the image doesn't contain text and instead shows, say, a lion? There are many different ways to describe that lion, from "lion" to "tawny, golden-furred lion king".

Or, should we be assuming that images that are written over text and links are only allowable when the image contains text?

The Google Webmaster Tools contributor states that you could use an image's ALT and TITLE attributes to essentially accomplish the same thing. This is sorta funny, because one could say the same of Google's use of the technique on their own page: why are they doing it at all?

One immediately wonders how Google polices this, since they’re apparently not frowning upon pages drawing images over text/links in all cases. They can detect text written over images, but would they have every instance checked by a human? Or, are they using optical character recognition algos to automatically check the text within images against the text being hidden?

In any case, the fact that Google is doing this on their own site could be taken as further confirmation that they don't consider the technique to be bad in and of itself, as long as the practice is conservative and the hidden text/link simply describes the text content within the image.
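If such a check were automated, one could imagine comparing the hidden text against whatever text is recoverable from the image. Here's a minimal sketch of that idea, using the image's ALT text as a stand-in for OCR output; the function names are illustrative, not anything Google has published:

```javascript
// Strip punctuation and collapse whitespace so the comparison is forgiving.
function normalize(text) {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9\s]/g, "")
    .trim()
    .split(/\s+/)
    .join(" ");
}

// Returns true when the hidden text plausibly describes the image.
// Here the ALT attribute stands in for OCR'd text from the image itself.
function hiddenTextMatchesImage(hiddenText, imageAltText) {
  return normalize(hiddenText) === normalize(imageAltText);
}
```

A check like this would pass Google's logo-over-text-link case (both say "Google") while flagging a page that hides unrelated keywords behind an innocuous image.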


Google’s Advice For Web 2.0 & AJAX Development

Yesterday, Google’s Webmaster Blog gave some great advice for Web 2.0 application developers in their post titled “A Spider’s View of Web 2.0”.

In that post, they recommend providing alternative navigation options on Ajaxified sites, both so that the Googlebot spider can index your site’s pages and for users who may have certain dynamic functions disabled in their browsers. They also recommend designing sites with “Progressive Enhancement”: build a site iteratively by starting with the basics. Begin with simple HTML linking navigation, then layer JavaScript/Java/Flash/AJAX structures on top of that simple HTML foundation.
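As a rough illustration of that Progressive Enhancement pattern (the function and class names here are made up for the example, not taken from Google's post): the server emits plain, crawlable HTML links that work everywhere, and client-side script, when it happens to run, upgrades those same links to AJAX behavior.

```javascript
// Step 1: the server renders plain HTML links. This is all that Googlebot,
// or a browser with scripting disabled, ever needs to navigate the site.
function renderNavLink(label, href) {
  return `<a class="nav-link" href="${href}">${label}</a>`;
}

// Step 2: in the browser, script (when available) upgrades those same links
// to fetch content in place instead of doing a full page load. If this
// function never runs, the plain href from step 1 still works.
function enhanceNav(document) {
  for (const link of document.querySelectorAll("a.nav-link")) {
    link.addEventListener("click", (event) => {
      event.preventDefault();
      fetch(link.href) // load the very same URL the crawler sees
        .then((response) => response.text())
        .then((html) => {
          document.getElementById("content").innerHTML = html;
        });
    });
  }
}
```

The key design point is that the AJAX layer reuses the same URLs as the static links, so there is never a navigation path that only scripted browsers can follow.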

Before the Google Webmaster team posted those recommendations, I’d published a little article earlier this week on Search Engine Land on how Web 2.0 and map mashup developers neglect SEO basics. A month back, my colleague Stephan Spencer also wrote an article on how Web 2.0 is often search-engine-unfriendly and how Progressive Enhancement can help make Web 2.0 content findable in search engines like Google, Yahoo!, and Microsoft Live Search.

Even earlier than both of us, our colleague P.J. Fusco wrote an article for ClickZ, “How Web 2.0 Affects SEO Strategy,” back in May.

We’re not just recycling each other’s work in all this; we’re each independently convinced that problematic Web 2.0 site design can limit a site’s traffic performance. If your pages don’t get indexed by the search engines, there’s a far lower chance of users finding your site. With just a mild amount of additional care and work, Web 2.0 developers can optimize their applications, and the benefits are clear. Wouldn’t you like to make a little extra money every month on ad revenue? Better yet, how about if an investment firm or a Google or Yahoo were to offer you millions for your cool mashup concept?!?

But, don’t just listen to all the experts at Netconcepts — Google’s confirming what we’ve been preaching for some time now.


Double Your Trouble: Google Highlights Duplication Issues

Maile Ohye posted a great piece on Google Webmaster Central on the effects of duplicate content as caused by common URL parameters. There is great information in that post, not least of which is that it validates exactly what a few of us have been stating for a while: duplication should be addressed because it can water down your PageRank.

Double Trouble: Duplicate Content Problems

Maile suggests a few ways of addressing dupe content, and she also reveals a few details of Google’s workings that are interesting, including: (more…)
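One common way site owners tackle parameter-driven duplication is to normalize URLs onto a single canonical form: drop parameters that don't change the page's content and fix the order of the rest, so parameter shuffling can't spawn "new" URLs. A small sketch of that idea (the parameter names below are made-up examples, not a list from Maile's post):

```javascript
// Parameters assumed, for this example, not to affect page content.
const IGNORED_PARAMS = new Set(["sessionid", "sort", "ref"]);

function canonicalize(rawUrl) {
  const url = new URL(rawUrl);
  // Drop parameters that don't change the content served...
  const kept = [...url.searchParams].filter(
    ([name]) => !IGNORED_PARAMS.has(name)
  );
  // ...and sort the remainder so ?a=1&b=2 and ?b=2&a=1 collapse together.
  kept.sort(([a], [b]) => a.localeCompare(b));
  url.search = new URLSearchParams(kept).toString();
  return url.toString();
}
```

Pointing internal links (and any 301 redirects) at the canonical form keeps link equity consolidated on one URL instead of diluted across duplicates.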

Matt Cutts reveals underscores now treated as word separators in Google

After the recent WordCamp conference, Stephan Spencer reports here and here that Matt Cutts stated that Google now treats underscores as white-space characters or word separators when interpreting URLs. Read on for more details and my take on it…

(more…)
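To see what the change means in practice, here's a small sketch of tokenizing a URL slug the way Matt Cutts described, treating underscores like hyphens; the function is purely illustrative, not Google's actual tokenizer:

```javascript
// Before this change, Google read "matt_cutts" as the single token
// "matt_cutts"; treating "_" as a word separator makes it equivalent
// to "matt cutts", the way hyphens have always behaved.
function tokensFromSlug(slug) {
  return slug.split(/[-_]+/).filter(Boolean);
}
```

The practical upshot is that existing underscore-based URLs now match their component keywords without needing to be rewritten to hyphens.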

Is SEO Awareness Dropping? Google Trends Shows it May Be

Using Google Trends, I noticed that searches in Google for “Search Engine Optimization” seem to have been dropping over the last two years:

Searches for Search Engine Optimization in Google Trends

(more…)

Automatic Search Engine Optimization through GravityStream

I’ve had a lot of questions about my new work since I joined Netconcepts a little over three months ago as their Lead Strategist for their GravityStream product/service. My primary role is to bring SEO guidance to clients using GravityStream, and to provide thought leadership to the ongoing development of the product and business.

GravityStream

GravityStream is a technical solution that provides outsourced search optimization to large, dynamic websites. Automatic SEO, if you will. Here’s what it does…

(more…)

Google Quality Scores for Natural Search Optimization

Google made big waves in the paid search marketing industry when they began introducing a Quality Score that impacted the cost and rankings of AdWords advertisements. Similar quality scoring methods are likely in use as ranking criteria for Google’s natural search results as well, and Google’s Webmaster Tools may hint at some of the criteria. Here are some details of those quality scoring criteria and some ways you can use them to improve rankings.

Google provides a very rough “formula” for their AdWords Quality Score:

Google AdWords Quality Score Formula

(more…)

MarketingProfs Webinar Today: SEO for Large Websites

I neglected to mention that I’m being interviewed today by Stephan Spencer in a MarketingProfs webinar we’ve entitled “Search Engine Optimization (SEO) for Really Big Websites” at 12:00 noon Eastern Time. We’ll go over a number of basics and issues involved in performing SEO at an enterprise level, and I think you’ll find some good information in the topics we’ll be covering. The cost is $99, and the seminar runs 90 minutes. If you can’t attend today, don’t worry: a recorded version will be available later.


When Google Changes Page Titles

As most webmasters are aware, the text put within a page’s <TITLE> tags appears in two places: at the top of the browser window when a user is viewing the page, and as the link anchor text on Google’s and other search engines’ results pages. But there are some rare occasions when Google will display different link text on their result pages than what is used in the <TITLE> text. So, why does this happen, and could it be happening to you?

I was recently researching some problems that a client was experiencing due to bad advice given to him by a prior SEO agency (they’d encouraged him to buy links and participate in link exchanges, I found, among other sins). While looking into the site’s problems, which included various over-optimizations and bad usability design, I discovered that when I used a particular keyword to search in Google, his site’s homepage came up with a completely different title in the search results. Most of his other desired keywords brought up his HTML <TITLE> text like normal, but this one did not…

(more…)
