Natural Search Blog


GravityStream Does Local SEO: Now Fixes Store Locator Pages

I’m pleased to announce that GravityStream can now optimize store locator pages for those retailer sites which provide search utilities for their local outlets.

GravityStream Compass Rose

As you may recall, I’ve written before about how dealer locators are terribly optimized and how store locator pages can be optimized. A great many store locator sections on major corporate sites do not allow search engine spiders to properly crawl through and index all of the locations where those companies have brick-and-mortar outlets.

Most large companies seem fairly unaware that their store locators are effectively blocking search engine spiders, making it impossible for end users to find their locations through simple keyword searches. I’ve also listed out a number of the top store locator providers that supply location-finding services like this for many Internet Retailer 500 companies.
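
To make the concept concrete, here’s a bare-bones sketch of the kind of crawlable alternative we advocate: plain HTML links pointing to a static page per location, rather than locations reachable only through a ZIP-code search form. The data shape, URL pattern, and function name below are hypothetical illustrations, not GravityStream’s actual implementation.

```typescript
// Hypothetical illustration only -- not GravityStream's actual code.
// The idea: give spiders plain <a href> links to a static page per store,
// instead of hiding every location behind a search form.

interface StoreLocation {
  id: string;
  name: string;
  city: string;
  state: string;
}

// Build a simple, crawlable HTML index of store pages.
function buildLocatorIndex(stores: StoreLocation[]): string {
  const links = stores
    .map(
      (s) =>
        `  <li><a href="/stores/${s.state.toLowerCase()}/${s.id}">` +
        `${s.name} - ${s.city}, ${s.state}</a></li>`
    )
    .join("\n");
  return `<ul class="store-index">\n${links}\n</ul>`;
}

// Example usage:
console.log(
  buildLocatorIndex([
    { id: "midtown-001", name: "Acme Outlet", city: "Madison", state: "WI" },
  ])
);
```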

Read on for details on our results…

(more…)

Advice on Subdomains vs. Subdirectories for SEO

Matt Cutts recently revealed that Google is now treating subdomains much more like subdirectories of a domain — in the sense that they wish to limit how many results from a single site show up for a given keyword search. In the past, some search marketers attempted to use keyworded subdomains as a method for improving search referral traffic — deploying many keyworded subdomains for the terms they hoped to rank well for.

Not long ago, I wrote an article on how some local directory sites were using subdomains in an attempt to achieve good ranking results in search engines. In that article, I concluded that most of these sites were ranking well for other reasons not directly related to the presence of the keyword as a subdomain — I showed examples of sites which, in many cases, ranked equally well or better when the keyword was part of the URI rather than the subdomain. So, in Google, subdirectories were already functioning just as well as subdomains for the purposes of keyword rank optimization. (more…)

Google Hiding Content Behind an Image on their SERPs

Tamar Weinberg at Search Engine Roundtable reports that in a Google Groups forum, a Webmaster Central team member stated that you could use something like the CSS z-index property to hide text or links behind an image, so long as the text/link being hidden is what’s represented in the image.

I think it’s a good thing that they do allow this sort of use, because it appears to me that they’re doing this very thing on their own search results pages! If you refresh a search page, you can see what they’re hiding under their own logo:

Google hides textlink behind logo

…a text link pointing to their homepage.
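
For readers who want to see what that layering looks like in practice, here’s a minimal sketch of my own (not Google’s actual markup or code): a logo image absolutely positioned over a plain text link, with z-index determining which element is rendered on top.

```typescript
// Illustration only -- not Google's markup. Shows the general z-index layering:
// a text link to the homepage sits underneath an absolutely positioned logo image,
// and the image's higher z-index keeps the image visually on top.

function layerLogoOverLink(container: HTMLElement): void {
  container.style.position = "relative";

  // The "hidden" text link -- still present in the HTML for spiders and screen readers.
  const link = document.createElement("a");
  link.href = "/";
  link.textContent = "Example Search"; // text matches what the logo image depicts
  link.style.position = "absolute";
  link.style.zIndex = "1";
  container.appendChild(link);

  // The logo image drawn over the link.
  const logo = document.createElement("img");
  logo.src = "/images/logo.png"; // hypothetical path
  logo.alt = "Example Search";
  logo.style.position = "absolute";
  logo.style.zIndex = "2"; // higher z-index: rendered on top of the link
  container.appendChild(logo);
}
```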

Now, the interesting question I’d have for the Google team about this would be: it’s straightforward if the image itself contains text, but what would be allowable if the image doesn’t contain text but is, say, an image of a lion? There are many different ways to express what that lion is, from "lion" to "tawny, golden-furred lion king".

Or, should we be assuming that images that are written over text and links are only allowable when the image contains text?

The Google Webmaster Tools contributor also states that you could use an image’s ALT and TITLE attributes to accomplish essentially the same thing. This is sorta funny, because one could say the same thing of Google’s use of the hidden-text technique on their own page — why are they doing it?

One immediately wonders how Google polices this, since they apparently don’t frown upon pages drawing images over text/links in all cases. They can detect when text is overlaid by an image, but would they have every instance checked by a human? Or are they using optical character recognition algorithms to automatically compare the text within an image against the text being hidden?

In any case, the fact that Google is doing this on their own site could be taken as further confirmation that they don’t consider the technique bad in and of itself — as long as the practice is conservative and the hidden text/link merely describes the text content within the image.


Google’s Advice For Web 2.0 & AJAX Development

Yesterday, Google’s Webmaster Blog gave some great advice for Web 2.0 application developers in their post titled “A Spider’s View of Web 2.0”.

In that post, they recommend providing alternative navigation options on Ajaxified sites, both so that the Googlebot spider can index your site’s pages and so that users who have certain dynamic functions disabled in their browsers can still get around. They also recommend designing sites with “Progressive Enhancement” — building a site iteratively by beginning with the basics first. Start out with simple HTML link navigation, then add Javascript/Java/Flash/AJAX structures on top of that simple HTML foundation.
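
As a rough illustration of that layered approach (a sketch of my own, not Google’s example code): start with an ordinary HTML link that loads a full page, then let script upgrade it into an in-place AJAX load when script is available. Spiders and script-disabled browsers still follow the plain href. The `data-ajax` attribute and `content` element ID below are hypothetical.

```typescript
// A minimal progressive-enhancement sketch (my illustration, not Google's code).
// The base experience is a plain <a href> link that works with no script at all;
// when JavaScript runs, we upgrade it to load the linked content in place via AJAX.

function enhanceNavLinks(): void {
  document.querySelectorAll<HTMLAnchorElement>("a[data-ajax]").forEach((link) => {
    link.addEventListener("click", async (event) => {
      event.preventDefault(); // only intercepted when script is available
      const response = await fetch(link.href);
      const html = await response.text();
      const target = document.getElementById("content");
      if (target) {
        target.innerHTML = html; // in-place "Web 2.0" update
      } else {
        window.location.href = link.href; // fall back to normal navigation
      }
    });
  });
}

document.addEventListener("DOMContentLoaded", enhanceNavLinks);
```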

Before the Google Webmaster team posted those recommendations, I’d published a little article earlier this week on Search Engine Land on how Web 2.0 and Map Mashup Developers neglect SEO basics. A month back, my colleague Stephan Spencer also wrote an article on how Web 2.0 is often search-engine-unfriendly and how using Progressive Enhancement can help make Web 2.0 content findable in search engines like Google, Yahoo!, and Microsoft Live Search.

Even earlier than both of us, our colleague P.J. Fusco wrote an article for ClickZ back in May on How Web 2.0 Affects SEO Strategy.

We’re not just recycling each other’s work in all this — we’re each independently convinced that problematic Web 2.0 site design can limit a site’s traffic. If your pages don’t get indexed by the search engines, there’s a far lower chance of users finding your site. With just a mild amount of additional care and work, Web 2.0 developers can optimize their applications, and the benefits are clear. Wouldn’t you like to make a little extra money every month on ad revenue? Better yet, how about if an investment firm or a Google or Yahoo were to offer you millions for your cool mashup concept?!?

But don’t just listen to all the experts at Netconcepts — Google’s confirming what we’ve been preaching for some time now.


Am I an SEO Dog? More On Toasting of Internet Yellow Pages

Donna Bogatin apparently disagreed with my article at SEL entitled “Google Trends: Yellow Pages Will Be Toast in Four Years”, posting a bit of a lurid headline herself: “Yellow Pages Trash Talking: The SEO Dog in the Google Local Fight”.

I didn’t really think that my article was quite “trash talk”, and I’m assuming from the article content that the “SEO Dog” referred to was perhaps myself, or perhaps the “dog” is my article’s conclusions, fighting for the ostensibly narrow viewpoint of all SEOs. Aside from the somewhat scathing disembowelment attempted, I thought it would be informative to address some of the logically faulty conclusions that were drawn.

(more…)

Search Engine Optimization through Yellow Pages

Yellow Pages & SEO

There’s an interesting thread that appeared on Greg Sterling’s blog on Using IYPs as an SEO Strategy.

Some of the commenters pointed out that yellow pages ads are pretty costly compared with those of the search engines. So, is using yellow pages as part of a search marketing campaign worthwhile for traffic and good for ROI? My answer is: yes, yellow pages can and should be used as a major component of local search optimization. Yellow pages can be used for SEO, and here are some details on how to approach it.

(more…)

Yes, you can automate SEO – we’ve done it!

Loren Baker at Search Engine Journal wrote a post highlighting Commerce360’s stated intention to build automatic optimization software, using a lot of venture capital they raised for this purpose. Loren asks, “Can SEO Be Automated?”

Inspired by this thread, Lisa Barone at Bruce Clay, Inc. responds with “You Can’t Automate Search Engine Optimization” (which is just the tiniest bit ironic, since Bruce Clay’s Dynamic Site Mapping tool arguably provides a level of automated search optimization).

While Commerce360 is looking to create search optimization automation, we’ve already been accomplishing it for quite some time here at Netconcepts, as I outlined in an earlier article on Automatic Search Engine Optimization. So, do I think SEO can be automated? Hell, yes!

(more…)

Using Flickr to Optimize for Yahoo Image Search

Google Blogoscoped reports that Yahoo’s Image Search now particularly likes Flickr content, so this may be an incentive for webmasters to use Flickr “as a kind of Yahoo search engine optimization”. My frequent readers know that I’ve been advocating using Flickr for image search optimization for some time now, and I’ve been speaking on this subject at Search Engine Strategies conferences as well.

The Blogoscoped mention of Yahoo’s love for Flickr content is particularly timely, since Yahoo! announced back in June that they were permanently shutting down Yahoo! Photos in favor of their Flickr property, and the final closing date is tomorrow, September 20th.

Previously, I’d railed a bit against Yahoo! because I’d seen a lot of evidence that they didn’t spider/index Flickr content as well or as comprehensively as Google did — altogether ironic, since Yahoo owns Flickr. Just as with the anecdotal reports in the Blogoscoped post, I’m seeing nice indications that the lack of Flickr inclusion I criticized earlier may now be completely resolved. (more…)

Double Your Trouble: Google Highlights Duplication Issues

Maile Ohye posted a great piece on Google Webmaster Central about the effects of duplicate content caused by common URL parameters. There is great information in that post, not least of which is that it validates exactly what a few of us have been saying for a while: duplication should be addressed because it can water down your PageRank.

Double Trouble: Duplicate Content Problems

Maile suggests a few ways of addressing dupe content, and she also reveals a few interesting details of how Google works, including: (more…)
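
One common way sites attack parameter-driven duplication (a sketch of my own below, not necessarily one of the methods Maile suggests) is to normalize URLs before linking or redirecting: strip session and tracking parameters and sort whatever remains, so every variant collapses to a single address. The parameter names here are hypothetical examples.

```typescript
// Hypothetical sketch of URL normalization to reduce parameter-driven duplicate content.
// Tracking/session parameters are stripped and the remaining ones sorted, so
// /shoes?sessionid=abc&color=red and /shoes?color=red resolve to the same URL.

const IGNORED_PARAMS = new Set(["sessionid", "utm_source", "utm_medium", "ref"]);

function canonicalizeUrl(rawUrl: string): string {
  const url = new URL(rawUrl);
  const kept = [...url.searchParams.entries()]
    .filter(([key]) => !IGNORED_PARAMS.has(key.toLowerCase()))
    .sort(([a], [b]) => a.localeCompare(b));
  url.search = new URLSearchParams(kept).toString();
  return url.toString();
}

// Example: both variants collapse to https://example.com/shoes?color=red
console.log(canonicalizeUrl("https://example.com/shoes?sessionid=abc&color=red"));
console.log(canonicalizeUrl("https://example.com/shoes?color=red&utm_source=email"));
```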

Matt Cutts Reveals Underscores Now Treated as Word Separators in Google

After the recent WordCamp conference, Stephan Spencer reports here and here that Matt Cutts stated that Google now treats underscores as white-space characters or word separators when interpreting URLs. Read on for more details and my take on it…

(more…)
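
To see what the change amounts to in practice, here’s a toy sketch of URL-keyword tokenization (my own illustration, not Google’s actual parser): under the old behavior, a slug like red_widgets was effectively one token, whereas treating underscores as separators yields the same keywords as red-widgets.

```typescript
// Toy illustration of the difference -- not Google's actual tokenizer.
// Old behavior: only hyphens, dots, and slashes split words, so "red_widgets"
// stays a single token. New behavior: underscores also act as word separators.

function tokenizeSlug(slug: string, underscoreSeparates: boolean): string[] {
  const separators = underscoreSeparates ? /[-_./]+/ : /[-./]+/;
  return slug.split(separators).filter((t) => t.length > 0);
}

console.log(tokenizeSlug("red_widgets", false)); // [ 'red_widgets' ]
console.log(tokenizeSlug("red_widgets", true));  // [ 'red', 'widgets' ]
console.log(tokenizeSlug("red-widgets", true));  // [ 'red', 'widgets' ]
```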
