Natural Search Blog

Lycos in the Search Engine Optimization business?

I had a chuckle when I read this: “Lycos’ Site Side Optimization services” A small-time search engine is now a self-appointed search engine optimization specialist, and here’s the best bit:

Your optimized site is regularly submitted to the free submit at major search engines for indexing.
Regular submissions ensure your site is spidered often by the search engines which results in refreshed content and improved rankings.

Wait a minute here; sites with reasonable links pointing to them never need to submit, and they certainly never need to re-submit! In actuality, search engines don't like you resubmitting sites they already know about. The majority of sites that "regularly submit" tend to be spam and porn. I can't remember us submitting a single site to a major search engine in years, and we launch lots of sites all the time. Every one of these sites gets crawled by all the important search engines within days of us placing a link on our home page to the newly launched site.

There must be better ideas for making some money out there, Lycos!


Dubious data from Trellian’s keyword research tool

On the face of it, Trellian’s keyword research tool is quite cool. Paying subscribers can get a full year’s worth of historical keyword popularity data. Finally, a way to quantify the seasonality of various keyword markets!

However, I have to say that after using it, I'm not really impressed. The main problem I have with it is that I just don't find their data to be believable. Too many discrepancies, too many gaps. Let me show you some specifics…

For starters, witness a huge spike in searches for “Christmas shopping” mid-year. Then it’s relatively flat during the Christmas buying season?!? This next one leaves me totally incredulous: no activity whatsoever throughout the year for the search term “holiday shopping” except April, May, and, to a lesser extent, December. Finally, for the very popular search term “shopping,” the month of April appears to have been totally lost.

Christmas shopping keyword search popularity
holiday shopping keyword search popularity
shopping keyword search popularity

So, although Trellian's keyword tool sounds good in theory, until their data starts looking a lot more credible, I'll be relying on WordTracker, Overture's Search Term Suggestion Tool, and Google's Keyword Sandbox for studying keyword popularity with search engine users. (In case you're curious, according to Overture's tool, keyword searches across Yahoo! and the rest of Overture's network during the month of October totaled 13,985 for "Christmas shopping", 2,751 for "holiday shopping", and 2,273,098 for "shopping".)


Google isn’t going to develop a web browser

In a recent blog entry I referred to The Register's speculation about Google building a web browser to compete with Microsoft's Internet Explorer. Apparently that isn't going to happen any time soon, according to the Associated Press:

Chief executive Eric Schmidt has, however, ruled out developing a Google browser to compete with Microsoft’s dominant Internet Explorer.


Is your site unfriendly to search engine spiders like MSNBot?

Microsoft blogger Eytan Seidman on their MSN Search blog offers some very useful specifics on what makes a site crawler unfriendly, particularly to MSNBot:

An example of a page that might look “unfriendly” to a crawler is one that looks like this:….URL’s with many (definitely more than 5) query parameters have a very low chance of ever being crawled….If we need to traverse through eight pages on your site before finding leaf pages that nobody but yourself points to, MSNBot might choose not to go that far. This is why many people recommend creating a site map and we would as well.
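Seidman's query-parameter rule of thumb is easy to express in code. Here is a minimal Python sketch: the more-than-5-parameters threshold comes from the quoted post, but the function name and everything else are my own illustration, not anything MSNBot actually runs.

```python
from urllib.parse import urlparse, parse_qs

# Threshold from the MSN Search blog post quoted above: URLs with
# more than 5 query parameters have a very low chance of being crawled.
MAX_QUERY_PARAMS = 5

def looks_crawler_friendly(url: str) -> bool:
    """Return True if a URL passes the simple query-parameter check."""
    query = urlparse(url).query
    num_params = len(parse_qs(query, keep_blank_values=True))
    return num_params <= MAX_QUERY_PARAMS

print(looks_crawler_friendly("http://example.com/page?id=1&cat=2"))   # friendly
print(looks_crawler_friendly(
    "http://example.com/p?a=1&b=2&c=3&d=4&e=5&f=6"))                  # too many params
```

Checks like this make a handy pre-launch audit: run your site's URLs through it before worrying about the subtler depth-of-crawl issue Seidman also mentions.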


Google Scholar – a new search engine for us eggheads

Google has just launched a new search engine called Google Scholar. It's an engine specifically for scholarly content, such as articles in academic journals. It's still in beta, so don't be too hard on Google if it's not perfect. Danny Sullivan has written an article in SearchDay about the new service. Good on ya, Google!

MarketingProfs webcast on SEO

Well, I presented at another MarketingProfs webcast (webinar) today. This one was on Search Engine Optimization: Maximizing Your Natural Search Channel. Wow did we get deluged with questions at the end! Much more so than the one I did 2 months ago, on unlocking the power of Google as a research tool. I’ll try to respond to the raft of attendee questions and compile them all into a Q&A document for everyone’s enjoyment.

Google’s index hits 8 billion pages. Yes folks, size does matter.

On Wednesday, the day before Microsoft unveiled the beta of MSN Search, Google announced that its index was now over eight billion pages strong. Impeccable timing from the Googleplex. Had Google waited just a couple more days, Microsoft could have proudly touted a web page index bigger than the one Google had previously reported. Still, Microsoft's 5 billion documents is an impressive feat, particularly for a new search engine just out of the blocks. Google continues to show its market dominance, however, with a database of a whopping 8,058,044,651 web pages. Poor Microsoft, trumped by Google at the last minute!

Why the big deal about index size? From the user's perspective, a search engine that covers the Web comprehensively is going to be more useful than one whose indexation is patchy. That's why I think the Overture Site Match paid inclusion program from Yahoo! is a really bad idea. Sites shouldn't pay the search engine to be indexed. Rather, the search engine should strive to index as much of the Web as possible, because that makes for a better search engine.

Indeed, I see Google’s announcement as a landmark in the evolution of search engines. Search engine spiders have historically had major problems with “spider traps” — dynamic database-driven websites that serve up identical or nearly identical content at varying URLs (e.g. when there is a session ID in the URL). Alas, search engines couldn’t find their way through this quagmire without severe duplication clogging up their indices. The solution for the search engines was to avoid dynamic sites, to a large degree — or at least to approach them with caution. Over time, however, the sophistication of the spidering and indexing algorithms has improved to the point that search engines (most notably, Google) have been able to successfully index a plethora of previously un-indexed content and minimize the amount of duplication. And thus, the “Invisible Web” begins to shrink. Keep it up, Google and Microsoft!
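To illustrate the spider-trap problem, here is a minimal Python sketch of the kind of URL canonicalization a crawler might do to collapse session-ID duplicates into one page. The list of session parameter names is hypothetical; real crawlers detect these patterns far more cleverly.

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Hypothetical session-ID parameter names, for illustration only.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonicalize(url: str) -> str:
    """Strip session-ID parameters so duplicate URLs collapse to one."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://shop.example.com/item?id=42&sessionid=abc123"))
# http://shop.example.com/item?id=42
```

With a normalization step like this, the same product page reached under a thousand different session IDs costs the index only one entry instead of a thousand near-duplicates.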

Watch out Google, Yahoo! Here comes MSN Search!

Today Microsoft announced the launch of a beta of its new MSN Search, built on Microsoft's own technology and no longer reliant on Yahoo! Search Technology. Formal launch of the new search service is slated for January.

Although functionality is currently somewhat limited and its relevancy algorithm isn't up to par with Google's (in my humble opinion), it is clear that Microsoft has great plans in store for its new baby. The possibility of integrating MSN Search with existing, increasingly web-enabled Microsoft products is obvious. MSN Search is already in beta for integration into MSN Messenger, and there is talk of using it with wireless applications.

The question that should be on everyone’s mind is: “What does this mean for search engine optimization?” And more specifically, what steps will you have to take to make sure your MSN Search rankings are as good as your Google results? With MSN controlling a 14% share of the search engine market, this is an important question to answer. So let me take a stab at a preliminary answer for you…

First, until MSN Search more fully indexes the web and launches formally to its user base, there is little point in losing any sleep over the issue. Even then, it will take some time for Microsoft to come close to challenging Google’s dominance. Although if Steve Ballmer has anything to say about it, that will happen soon. “We will catch up, we will surpass,” Ballmer was quoted as saying recently at Microsoft’s annual meeting. Empty words? I don’t think so.

Secondly, no rushed changes should be made that could endanger any of your existing rankings in Google and Yahoo!

Finally, talk to the experts to start planning for optimization of MSN Search. Even if you know a few things about search optimization, you don’t know what you don’t know. Your current arsenal of SEO techniques may not work as well on the new engine, and getting qualified advice early could save you a lot of the time and expense incurred with going down a wrong path.

28% of searchers account for 68% of searches

John Battelle shared some interesting search engine usage stats courtesy of Gian Fulgoni of comScore. According to John’s source at comScore, 28% of searchers account for 68% of searches. This comes close to following the 80/20 rule — a bit surprising, don’t you think?

Jeff Jarvis at BuzzMachine expounds further on comScore's findings, thanks to the notes he took while attending Gian's presentation last month at the Web 2.0 conference. According to Jeff, comScore also analyzed brand share among different types of searchers. They found that heavy searchers look at online retail sites like Walmart, Overstock, Costco, and Amazon, with the low-cost leaders on top. So, I guess you can conclude that heavy searchers are bargain hunters. But do they have a high customer lifetime value, or not? THAT'S the question I'd really like answered!

Top sites by PageRank score

For a very long time I was one of the elite few who knew how to get a list of the top 1000 web pages on the Internet sorted in order of Google’s PageRank importance score. Since this top secret little trick no longer works, I feel I can share it with you all now. 😉

The trick is this: a search for http in Google, with your Google Preferences set to return 100 results per page, used to supply you with the top 1,000, 100 at a time. Boy, was that handy!
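For the curious, the trick amounted to stepping through ten result pages. Expressed as URLs using Google's num and start query parameters, it looked roughly like the sketch below; these URLs are purely illustrative now that the trick no longer works.

```python
# The old trick as URLs: search for "http" with 100 results per page,
# then step through the ten result pages (start=0, 100, ..., 900).
BASE = "http://www.google.com/search?q=http&num=100"

pages = [f"{BASE}&start={start}" for start in range(0, 1000, 100)]
for url in pages[:2]:
    print(url)
```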

RSS Feeds