Natural Search Blog


Welcome to Natural Search Blog

Natural Search Blog provides articles on search engine optimization, including keyword research, on-page factors, link-building, social media optimization, local search optimization, image search optimization, and mobile SEO.

In addition to natural search optimization topics, we also cover internet marketing, ecommerce, web design, usability, and technology.

Recent Entries

New WordPress Plugin for SEO

I’ve just released “SEO Title Tag”, a plugin for WordPress. As the name implies, it allows you to optimize your WordPress site’s title tags in ways not supported by the default WordPress installation. For example:

Get the plugin now: SEO Title Tag WordPress Plugin

I’d love your feedback, as this is my first WordPress plugin.

Enjoy!

Toolbar PageRank Update

Yep, it’s that time again.

I don’t usually care that much, but we had a little snafu with our PageRank readout on the toolbar for our netconcepts.com site due to a misconfiguration on our end (detailed in my post “Toolbar PageRank Update Is Currently Underway”), and happily that’s now corrected.


Google Sitemaps Reveal Some of the Black Box

I earlier mentioned the recent Sitemaps upgrades that were announced in June, and how I thought these were useful for webmasters. But the Sitemaps tools may also be useful in ways beyond the obvious, intended ones.

The information that Google has made available in Sitemaps is providing a cool bit of intel on yet another one of the 200+ parameters or “signals” that they’re using to rank pages for SERPs.

For reference, check out the Page Analysis Statistics that are provided in Sitemaps for my “Acme” products and services experimental site:

Google Sitemaps Page Analysis

It seems unlikely to me that these stats on “Common Words” found “In your site’s content” were generated just for the sake of providing nice tools for us in Sitemaps. No, the more likely scenario would seem to be that Google was already collating the most-common words found on your site for their own uses, and then they later chose to provide some of these stats to us in Sitemaps.

This is significant, because we already know that Google tracks keyword content for each page in order to assess its relevancy for search queries containing those terms. But why would Google be tracking your most-common keywords in a site-wide context?

One good explanation presents itself: Google might be tracking common terms used throughout a site in order to assess if that site should be considered authoritative for particular keywords or thematic categories.
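To make that idea concrete, here’s a minimal sketch (in Python) of the kind of site-wide word collation the “Common Words” report implies. The pages, stopword list, and counts below are my own made-up illustration, not anything Google has published:

    # Minimal sketch of site-wide common-word collation, along the lines of what
    # the Sitemaps "Common Words" report appears to show. The pages dict and the
    # stopword list are hypothetical stand-ins, not anything Google has published.
    import re
    from collections import Counter

    STOPWORDS = {"the", "and", "a", "of", "to", "in", "for", "on", "is", "with"}

    def common_words(pages, top_n=10):
        """Count the most common words across all pages of a site."""
        counts = Counter()
        for url, text in pages.items():
            words = re.findall(r"[a-z]+", text.lower())
            counts.update(w for w in words if w not in STOPWORDS)
        return counts.most_common(top_n)

    if __name__ == "__main__":
        pages = {
            "/": "Acme products and services for every Acme need",
            "/widgets": "Acme widgets are durable widgets built by Acme",
            "/about": "About Acme, a maker of products, widgets and services",
        }
        for word, count in common_words(pages):
            print(f"{word}: {count}")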

Early on, algorithmic researchers such as Jon Kleinberg worked on methods by which “authoritative” sites and “hubs” could be identified. IBM and others did further research on authority/hub identification, and I heard engineers from Teoma speak on the importance of these approaches a few times at SES conferences when explaining the ExpertRank system their algorithms were based upon.
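For those who haven’t seen it, Kleinberg’s hubs-and-authorities idea (HITS) is simple enough to sketch in a few lines. This toy Python version, run over a made-up link graph, is only meant to illustrate the concept; it isn’t ExpertRank and it certainly isn’t whatever Google is doing internally:

    # Toy implementation of Kleinberg's HITS (hubs and authorities) over a
    # hypothetical link graph; shown only to illustrate the idea, not ExpertRank
    # or anything Google actually runs.
    def hits(graph, iterations=50):
        """graph maps page -> list of pages it links to."""
        pages = set(graph) | {q for targets in graph.values() for q in targets}
        hubs = {p: 1.0 for p in pages}
        auths = {p: 1.0 for p in pages}
        for _ in range(iterations):
            # Authority score: sum of hub scores of the pages linking in.
            auths = {p: sum(hubs[q] for q in pages if p in graph.get(q, []))
                     for p in pages}
            # Hub score: sum of authority scores of the pages linked to.
            hubs = {p: sum(auths[q] for q in graph.get(p, [])) for p in pages}
            # Normalize so the scores don't blow up as we iterate.
            a_norm = sum(v * v for v in auths.values()) ** 0.5 or 1.0
            h_norm = sum(v * v for v in hubs.values()) ** 0.5 or 1.0
            auths = {p: v / a_norm for p, v in auths.items()}
            hubs = {p: v / h_norm for p, v in hubs.items()}
        return hubs, auths

    if __name__ == "__main__":
        links = {"hub1": ["siteA", "siteB"], "hub2": ["siteA", "siteC"], "siteA": ["siteB"]}
        hubs, auths = hits(links)
        print(sorted(auths.items(), key=lambda kv: -kv[1]))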

So, it’s not all that surprising that Google may be trying to use commonly occurring text to help identify authoritative sites for various themes. This would be one good automated method for classifying sites by subject matter categories and keywords.

The take-away concept is that Google may be using words found in the visible text throughout your site to assess whether you’re authoritative for particular themes or not.

 

Back from July 4th vacation

You may’ve noticed that I took a few days off from the grind to spend the long July 4th weekend with friends, swimming in the Guadalupe River at a ranch in Hunt, Texas.

If you’re curious about what I look like, you can check out this pic of me on flickr that my friend Suzanne took during our trip.

Now that I’m back and maybe starting to get acclimated in my new position at Verizon, I’m hoping to post here a bit more regularly.


Click Fraud Costs Estimated at over $800M

A new report, Advertisers Cut Spending, Blame Google and Yahoo for Click Fraud, states that advertisers wasted over $800 million last year on phony clicks.

Some points of interest:

I predict that this fraud perception will fuel advertisers’ increasing reliance on natural search, where click fraud is not incentivized.

Will click fraud be the catalyst that finally causes retailers to allocate their spending more equally between PPC (pay per click) and NSO (natural search optimization)? For example, shifting from $1MM/yr on PPC and $150k/yr on NSO to more like $1MM/yr on each?

As PPC gets more expensive, the act of click fraud gets more costly, and that bad apple must begin to spoil the bucket at some point. Not completely, I’m sure, but probably enough to cause advertisers to rethink their allocation and the importance of NSO.
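Here’s a quick back-of-the-envelope sketch of that “bad apple” arithmetic. The CPC and fraud rate are hypothetical numbers I picked for illustration, not figures from the report:

    # Back-of-the-envelope sketch of how click fraud inflates the effective cost
    # per legitimate click. The CPC and fraud rate below are hypothetical
    # illustrations, not figures from the report.
    def effective_cpc(cpc, fraud_rate):
        """Cost per *legitimate* click when a fraction of paid clicks are fraudulent."""
        return cpc / (1.0 - fraud_rate)

    ppc_budget = 1_000_000  # $1MM/yr PPC spend
    cpc = 0.75              # hypothetical average cost per click
    fraud_rate = 0.14       # hypothetical share of clicks that are phony

    paid_clicks = ppc_budget / cpc
    wasted = ppc_budget * fraud_rate
    print(f"Clicks bought: {paid_clicks:,.0f}")
    print(f"Spend lost to phony clicks: ${wasted:,.0f}")
    print(f"Effective CPC on real clicks: ${effective_cpc(cpc, fraud_rate):.2f}")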


How much is a text link ad worth?

Text-Link-Ads.com recently came out with a link calculator to help you estimate the value of a text link advertisement on a publisher’s website.

Just supply the site’s URL and a couple of other bits of information, and it will tell you a ballpark price for a link on that page, if that site were selling links.


Google Sitemaps upgrades help webmasters

The Google Sitemaps team just last week announced a number of changes on their blog.

I was really happy and excited that they appear to’ve done a few of the things I suggested in a post on the Google Sitemaps Group.

They did the following things I had suggested:

There were some additional things they did which are also interesting:

I’m sure other folks must’ve requested some of the same things I’d suggested, and Google’s good at providing useful features, but it’s really gratifying to see some of the changes I’d wanted showing up now!

Stay tuned for a follow-up posting from me about some of these changes. Some of these new features actually provide some great intel on parameters/methods that Google uses to rank pages.


If you can’t do good design or good SEO… use witchcraft!

I just read this story on CNN today about how some firms offer to optimize your website by applying principles of vaastu shastra and feng shui to increase usage.

Interesting idea: If you can’t do good engineering for usability, good graphic design, and good SEO to bring traffic to your site, use witchcraft!


The Long Tail and prioritizing your time on design and SEO

I am a big fan of the Long Tail, the term coined by Chris Anderson, Executive Editor of Wired Magazine, to refer to what happens in economics when the bottlenecks that stand between supply and demand in our culture start to disappear and everything becomes available to everyone.

In this article, I found it quite interesting that UIE applied the concept of the Long Tail to prioritizing where you spend the bulk of your time on design and usability. Sure, there are a few pages that get a large chunk of traffic, such as the home page, but that doesn’t mean that’s where you should spend most of your design time. Instead, look at the buckets of pages that add up to a large chunk of your traffic. For example, if all of the articles on your site add up to a large amount of your traffic, then you should spend a reasonable amount of your redesign time focusing on the articles template.

I think this same argument applies to search engine optimization (SEO) as well as to design. If your product pages account for 50% of your traffic, half of your SEO time should be spent on the product pages (rather than your articles, FAQs, etc.).
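As a rough illustration, here’s how that proportional allocation might look in a few lines of Python, using made-up traffic numbers for each page template:

    # Sketch of allocating design/SEO hours in proportion to the traffic each
    # template bucket drives. The traffic numbers are made up for illustration.
    def allocate_hours(traffic_by_template, total_hours):
        total_traffic = sum(traffic_by_template.values())
        return {tpl: total_hours * visits / total_traffic
                for tpl, visits in traffic_by_template.items()}

    traffic = {
        "home page": 40_000,
        "product pages": 250_000,  # the long tail: many pages, lots of aggregate traffic
        "article pages": 150_000,
        "FAQ pages": 60_000,
    }
    for template, hours in allocate_hours(traffic, total_hours=100).items():
        print(f"{template}: {hours:.0f} hours")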

Spend your time on the tail!


.MOBI Top Level Domain Names Have Misguided Rules

Well, the “Sunrise Registration” period for the new .MOBI top level domain names just started up about a week ago, and I have to say that the rules that have been imposed with .MOBI are irritating. The company that serves as the registry for it, “mobile Top Level Domain Ltd” (“mTLD”), has required that anyone delivering up content on a .MOBI TLD must serve at least the root-level page in XHTML-MP format.

According to their mandatory registrant rules, you could just own the .MOBI domain for your site and not publish a site on it at all, just sit on it to keep others from hosting stuff on your trademarked name. But once you publish content on the .MOBI domain, at least the root response must be in the XHTML-MP flavor, and they will police these domains to ensure compliance. Sites not in compliance will be warned, and if they aren’t fixed, their zone file entries will be deleted until the sites are corrected!
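Out of curiosity, here’s a rough sketch of the sort of automated check the registry might run: fetch the root page and look for an XHTML-MP content type or DOCTYPE. This is purely my own guess at it; mTLD hasn’t published how their compliance checks actually work, and example.mobi below is just a placeholder domain:

    # Rough sketch of a compliance check along the lines of what mTLD describes:
    # fetch the root page and look for an XHTML-MP content type or DOCTYPE.
    # This is my own illustration; mTLD's actual checks aren't public.
    import urllib.request

    XHTML_MP_TYPES = ("application/vnd.wap.xhtml+xml", "application/xhtml+xml")
    XHTML_MP_DOCTYPE = "XHTML Mobile"

    def root_is_xhtml_mp(domain):
        with urllib.request.urlopen(f"http://{domain}/") as resp:
            content_type = resp.headers.get("Content-Type", "")
            body = resp.read(2048).decode("utf-8", errors="replace")
        return (any(t in content_type for t in XHTML_MP_TYPES)
                or XHTML_MP_DOCTYPE in body)

    if __name__ == "__main__":
        print(root_is_xhtml_mp("example.mobi"))  # placeholder domain, not a real check target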

Now, I understand that they idealistically want to make the internet world a better place, and they’re seeking to ensure consistency by imposing this standard. However, I think they’re misguided, and this is a pretty bad business decision. I don’t see anything wrong with having generally thematic rules associated with TLDs, like using .EDU only for educational institutions and .MIL only for military sites. My beef is with having a registry take on the additional powers of setting a required protocol for the content on the site, policing it and checking for validity, and unplugging sites that don’t comply. (more…)
