Welcome to Natural Search Blog
Natural Search Blog provides articles on search engine optimization, including keyword research, on-page factors, link-building, social media optimization, local search optimization, image search optimization, and mobile SEO.
In addition to natural search optimization topics, we also cover internet marketing, ecommerce, web design, usability, and technology.
Recent Entries
New WordPress Plugin for SEO
I’ve just released “SEO Title Tag”, a plugin for WordPress. As the name implies, it allows you to optimize your WordPress site’s title tags in ways not supported by the default WordPress installation. For example:
- If you define a custom field (called “title_tag”) when writing or editing a post (or static page), the value of that field will be used as the title tag.
- The post title and blog name are reversed for better keyword prominence within the title tag.
- You can shorten or eliminate the blog name altogether from your title tags.
- You can define a custom title tag for your home page through the Options page.
- It will use the category’s description as the title on category pages (when defined).
- If you’re using the UltimateTagWarrior plugin, it will put the tag name in the titles on tag pages.
- It will also cook you dinner and all sorts of other amazing, useful stuff (not really).
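WordPress plugins are written in PHP, but the title-building rules listed above are simple enough to sketch out language-agnostically. Here is a toy Python rendering of that logic; all names are hypothetical, for illustration only, and this is not the plugin’s actual code:

```python
# Hypothetical sketch of the title-building rules listed above.
# Not the plugin's actual (PHP) code; all names here are made up.

def build_title_tag(post_title, blog_name, custom_title=None,
                    show_blog_name=True, separator=" | "):
    """Return the title tag text for a post or page."""
    if custom_title:
        # A "title_tag" custom field overrides everything else.
        return custom_title
    if show_blog_name and blog_name:
        # Post title comes first for keyword prominence; the blog name
        # (possibly shortened) comes last, or can be dropped entirely.
        return post_title + separator + blog_name
    return post_title

print(build_title_tag("10 Title Tag Tips", "Natural Search Blog"))
# -> 10 Title Tag Tips | Natural Search Blog
print(build_title_tag("10 Title Tag Tips", "Natural Search Blog",
                      custom_title="Title Tag Optimization Tips"))
# -> Title Tag Optimization Tips
```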
Get the plugin now: SEO Title Tag WordPress Plugin
I’d love your feedback, as this is my first WordPress plugin.
Enjoy!
Posted by stephan on 07/14/2006
Filed under: HTML Optimization, Tools | Tags: Blog Optimization, blogging, page-titles, plugin, plugins, SEO, title-tags, WordPress
Toolbar PageRank Update
Yep, it’s that time again.
I don’t usually care that much, but we had a little snafu with our PageRank readout on the toolbar for our netconcepts.com site due to a misconfiguration on our end (detailed in my post “Toolbar PageRank Update Is Currently Underway”), and happily that’s now corrected.
Posted by stephan on 07/14/2006
Filed under: Google, PageRank | Tags: Google, Google-Toolbar, PageRank
Google Sitemaps Reveal Some of the Black Box
I earlier mentioned the recent Sitemaps upgrades that were announced in June, and how I thought these were useful for webmasters. But the Sitemaps tools may also be useful in ways beyond the obvious, intended ones.
The information that Google has made available in Sitemaps provides a cool bit of intel on yet another of the 200+ parameters or “signals” that they’re using to rank pages in SERPs.
For reference, check out the Page Analysis Statistics that are provided in Sitemaps for my “Acme” products and services experimental site.
It seems unlikely to me that these stats on “Common Words” found “In your site’s content” were generated just for the sake of providing nice tools for us in Sitemaps. No, the more likely scenario would seem to be that Google was already collating the most-common words found on your site for their own uses, and then they later chose to provide some of these stats to us in Sitemaps.
This is significant because we already knew that Google tracks the keyword content of each page in order to assess its relevancy for search queries containing those terms. But why would Google be tracking your most common keywords in a site-wide context?
One good explanation presents itself: Google might be tracking common terms used throughout a site in order to assess if that site should be considered authoritative for particular keywords or thematic categories.
Early on, algorithmic researchers such as Jon Kleinberg worked on methods by which “authoritative” sites and “hubs” could be identified. IBM and others did further research on authority/hub identification, and I heard engineers from Teoma speak on the importance of these approaches a few times at SES conferences when explaining the ExpertRank system their algorithms were based upon.
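For intuition, here is a minimal sketch of the hubs-and-authorities (HITS) iteration that Kleinberg described; this is an illustration only, not any engine’s production algorithm:

```python
# Minimal sketch of Kleinberg's hubs-and-authorities (HITS) iteration.
# Illustration only -- not Google's or Teoma's production code.

def hits(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # Authority score: endorsed by good hubs pointing at you.
        auth = {p: sum(hub[q] for q in pages if p in links.get(q, []))
                for p in pages}
        # Hub score: points at good authorities.
        hub = {p: sum(auth[t] for t in links.get(p, [])) for p in pages}
        # Normalize so the scores don't blow up across iterations.
        for scores in (auth, hub):
            norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
            for p in scores:
                scores[p] /= norm
    return hub, auth

hub, auth = hits({"a": ["b", "c"], "b": ["c"], "d": ["c"]})
print(max(auth, key=auth.get))   # "c" emerges as the top authority
```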
So, it’s not all that surprising that Google may be trying to use commonly occurring text to help identify authoritative sites for various themes. This would be one good automated method for classifying sites by subject matter categories and keywords.
The take-away concept is that Google may be using words found in the visible text throughout your site to assess whether you’re authoritative for particular themes or not.
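As a toy illustration of what that kind of site-wide collation might look like (assuming trivial tokenization and a tiny stopword list; Google’s actual processing is of course unknown):

```python
# Toy illustration of aggregating a site's most common words, akin to
# the "Common Words ... In your site's content" stats Sitemaps shows.
# Tokenization and stopwords here are deliberately simplistic.
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "of", "to", "in", "is", "for"}

def common_words(pages, top_n=75):
    """pages: iterable of visible-text strings, one per page on the site."""
    counts = Counter()
    for text in pages:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

site = ["Acme widgets and widget services", "Buy Acme widgets in bulk"]
print(common_words(site, top_n=3))
# e.g. [('acme', 2), ('widgets', 2), ('widget', 1)]
```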
Posted by Chris of Silvery on 07/11/2006
Filed under: Google, Tools | Tags: Algorithms, Authoritative-Hubs, ExpertRank, Google, Hubs, Keyword-Classification, On-Page-Factors, PageRank, Search Engine Optimization, SEO, Sitemaps
Back from July 4th vacation
You may’ve noticed that I took a few days off from the grind to spend the long July 4th weekend with friends, swimming about the Guadalupe River at a ranch in Hunt, Texas.
If you’re curious about what I look like, you can check out this pic of me on flickr that my friend Suzanne took during our trip.
Now that I’m back and maybe starting to get acclimated to my new position at Verizon, I’m hoping to post here a bit more regularly.
Posted by Chris of Silvery on 07/11/2006
Filed under: General | Tags: personal-notes, vacation
Click Fraud Costs Estimated at over $800M
A new report, Advertisers Cut Spending, Blame Google and Yahoo for Click Fraud, states that advertisers wasted over $800 million last year on phony clicks.
Some points of interest:
- “The Internet advertising market is expected to be worth about $15.6 billion in 2006, up from about $10 billion in 2005.”
- “Google is expected to capture about 25% of that market, compared to Yahoo’s expected 20%, according to research firm eMarketer.”
- PPC is therefore valued at around $7-8 billion this year (rough arithmetic below).
- “15% is estimated as fraudulent”
- “37% of advertisers are reducing their PPC activity”
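To make the arithmetic in those bullets explicit, here is the back-of-envelope math. The shares and rates are the report’s round estimates, so treat the outputs as ballpark figures:

```python
# Back-of-envelope math behind the figures quoted above.
market_2006 = 15.6e9        # total internet ad market, 2006 estimate
ppc_share = 0.25 + 0.20     # Google ~25% plus Yahoo ~20% of that market
ppc_2006 = market_2006 * ppc_share
print(f"2006 PPC market: ~${ppc_2006 / 1e9:.1f}B")   # ~$7.0B

fraud_rate = 0.15           # "15% is estimated as fraudulent"
market_2005 = 10e9          # last year's market size, per the report
fraud_2005 = market_2005 * ppc_share * fraud_rate
# ~$675M: the same ballpark as the report's $800M+ figure, given how
# rough the share and fraud-rate estimates are.
print(f"2005 fraud estimate: ~${fraud_2005 / 1e6:.0f}M")
```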
I predict that this fraud perception will fuel advertisers’ increasing reliance on natural search, where click fraud is not incentivized.
Will click fraud be the catalyst that finally causes retailers to more equally allocate their spending between PPC (pay per click) and NSO (natural search optimization)? So, for example, shift from $1MM/yr PPC and $150k on NSO, to more like $1MM/yr PPC and $1MM/yr NSO?
As PPC gets more expensive, each fraudulent click costs advertisers more, and that bad apple must begin to spoil the bucket at some point (not completely, I’m sure, but probably enough to cause advertisers to rethink their allocation and the importance of NSO).
Posted by stephan on 07/10/2006
Filed under: Paid Search | Tags: click-fraud, Paid Search, ppc
How much is a text link ad worth?
Text-Link-Ads.com recently came out with a link calculator to help you estimate the value of a text link advertisement on a publisher’s website.
Just supply the site’s URL and a couple of other bits of information, and it will tell you a ballpark price for a link on that page, if that site were selling links.
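The calculator’s actual formula isn’t published, so purely as a guess at the kind of inputs such a tool might weigh, here is a hypothetical toy model; every variable name and coefficient below is invented:

```python
# Entirely hypothetical toy model of text-link pricing; the real
# calculator's formula is not public, and these weights are invented.
def estimate_link_price(pagerank, links_on_page, monthly_visitors,
                        base_rate=5.0):
    """Guess a monthly price (USD) for one text link on a page."""
    # Higher PageRank and traffic raise a page's visibility; more
    # links on the page dilute each individual link's share of it.
    visibility = (pagerank ** 2) * (1 + monthly_visitors / 10_000)
    return base_rate * visibility / max(links_on_page, 1)

price = estimate_link_price(pagerank=5, links_on_page=10,
                            monthly_visitors=20_000)
print(f"~${price:.2f}/month")   # ~$37.50/month with these made-up weights
```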
Posted by stephan on 07/07/2006
Filed under: Link Building, Tools | Tags: link-calculator, text-link-ads, text-link-ads.com, text-links
Google Sitemaps upgrades help webmasters
The Google Sitemaps team just last week announced a number of changes on their blog.
I was really happy and excited that they appear to’ve done a few of the things I suggested in a post on the Google Sitemaps Group.
They did the following things I had suggested:
- Expanded the number of top search terms, and of words commonly found on your site’s pages, from a maximum of 20 to 75 (I’d still like to see more);
- They added the ability to see the top search queries lists for a number of the other Google search areas, including Froogle, Google Base, Images, Maps, etc! This is really pleasing!
- They removed common terms that previously appeared in these lists — “html” and “www”.
There were some additional things they did which are also interesting:
- They now show ALL URLs they had trouble crawling;
- They show the top query keywords your site appears for, broken out by the web properties I mentioned above, AND broken out by country!
- They’ve provided a robots.txt tool so that the (surprisingly large number of) people who have difficulty forming their robots.txt files have a resource to check their file for them (a quick way to run a similar check yourself is sketched after this list).
- They’ve added a three-value rating scale to each of their main features, allowing one to flag the item as “happy face”, “neutral face”, or “unhappy face”.
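On the robots.txt point: a quick way to sanity-check your own file is Python’s standard-library parser. The example.com URLs below are placeholders for your own site:

```python
# Sanity-check a robots.txt with Python's standard library.
# example.com is a placeholder; point this at your own site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://example.com/robots.txt")
rp.read()   # fetch and parse the live file

# Would Googlebot be allowed to crawl this URL under your rules?
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))
```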
I’m sure other folx must’ve requested some of the same things I’d suggested, and Google’s good at providing useful features, but it’s really gratifying to see some of the changes I’d wanted showing up now!
Stay tuned for a follow-up posting from me about some of these changes. Some of these new features actually provide some great intel on parameters/methods that Google uses to rank pages.
Posted by Chris of Silvery on 06/26/2006
Filed under: Google, Tools | Tags: Google, Google-Sitemaps, google-webmaster-tools, SEO
If you can’t do good design or good SEO… use witchcraft!
I just read this story on CNN today about how some firms offer to optimize your website by applying principles of vaastu shastra and feng shui to increase usage.
Interesting idea: If you can’t do good engineering for usability, good graphic design, and good SEO to bring traffic to your site, use witchcraft!
Posted by Chris of Silvery on 06/26/2006
Filed under: General, News, Search Engine Optimization, SEO | Tags: feng-shui, SEO, usability, vaastu-shastra, website-design, witchcraft
The Long Tail and prioritizing your time on design and SEO
I am a big fan of the Long Tail, the term coined by Chris Anderson, Executive Editor of Wired Magazine, to refer to what happens in economics when the bottlenecks that stand between supply and demand in our culture start to disappear and everything becomes available to everyone.
In this article, UIE applies the concept of the Long Tail to prioritizing where you spend the bulk of your time on design and usability, which I found quite interesting. Sure, a few pages get a large chunk of traffic, such as the home page, but that doesn’t mean that’s where you should spend most of your design time. Instead, look at the buckets of pages that together add up to a large chunk of your traffic. For example, if all of the articles on your site add up to a large share of your traffic, then you should spend a reasonable amount of your redesign time focusing on the article template.
I think this same argument applies to search engine optimization (SEO) as well as to design. If your product pages account for 50% of your traffic, half of your SEO time should be spent on the product pages (rather than your articles, FAQs, etc.).
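A minimal sketch of that proportional-allocation idea, with made-up traffic numbers:

```python
# Allocate design/SEO hours to page buckets in proportion to the
# traffic each bucket drives. The traffic numbers below are made up.
def allocate_hours(traffic_by_bucket, total_hours):
    total_visits = sum(traffic_by_bucket.values())
    return {bucket: round(total_hours * visits / total_visits, 1)
            for bucket, visits in traffic_by_bucket.items()}

traffic = {"home page": 20_000, "product pages": 50_000,
           "articles": 25_000, "FAQs": 5_000}
print(allocate_hours(traffic, total_hours=40))
# -> {'home page': 8.0, 'product pages': 20.0, 'articles': 10.0, 'FAQs': 2.0}
# Product pages drive 50% of the traffic, so they get half the hours.
```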
Spend your time on the tail!
Posted by stephan on 06/26/2006
Filed under: Search Engine Optimization
.MOBI Top Level Domain Names Have Misguided Rules
Well, the “Sunrise Registration” period for the new .MOBI top-level domain names just started up about a week ago, and I have to say that the rules that have been imposed with .MOBI are irritating. The company that serves as the registry for it, “mobile Top Level Domain Ltd” (“mTLD”), requires that anyone delivering content on a .MOBI TLD serve at least the root-level page in XHTML-MP format.
According to their mandatory registrant rules, you could just own the .MOBI domain for your site and not publish a site on it; just sit on it, to keep others from hosting stuff on your trademarked name. But once you publish content on the .MOBI domain, at least the root response must be in the XHTML-MP flavor, and they will police these domains to ensure compliance. Sites not in compliance will be warned, and if they aren’t fixed, their zone file entries will be deleted until the sites are corrected!
Now, I understand that they idealistically want to make the internet world a better place, and they’re seeking to ensure consistency by imposing this standard. However, I think they’re misguided, and this is a pretty bad business decision. I don’t see anything wrong with having generally thematic rules associated with TLDs, like using .EDU only for educational institutions and .MIL only for military sites. My beef is with having a registry take on the additional powers of setting a required protocol for a site’s content, policing it and checking for validity, and unplugging sites that don’t comply. (more…)
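Out of curiosity about what such policing might involve, here is a naive compliance probe. It just looks for the XHTML-MP media type or DOCTYPE on a root page; it is certainly not mTLD’s actual validator, and example.mobi is a placeholder domain:

```python
# Naive check: does a .mobi root page look like XHTML-MP?
# Not mTLD's actual validator; example.mobi is a placeholder.
from urllib.request import urlopen

XHTML_MP_MIME = "application/vnd.wap.xhtml+xml"
XHTML_MP_DOCTYPE = "-//WAPFORUM//DTD XHTML Mobile"

with urlopen("http://example.mobi/") as resp:
    content_type = resp.headers.get("Content-Type", "")
    head = resp.read(2048).decode("utf-8", errors="replace")

if XHTML_MP_MIME in content_type or XHTML_MP_DOCTYPE in head:
    print("Root page looks like XHTML-MP.")
else:
    print("Root page does NOT look like XHTML-MP.")
```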
Posted by Chris of Silvery on 06/20/2006
Filed under: Domain Names, URLs | Tags: .mobi, Domain Names, Mobile-Application, TLDs, XHTML-MP