Natural Search Blog


Google Purchase of DoubleClick Under FTC Investigation

The NY Times reports that the U.S. Federal Trade Commission has opened a preliminary antitrust investigation into Google’s planned $3.1 billion purchase of the online advertising company DoubleClick, according to an industry executive briefed on the agency’s plans.

Some consumer groups have raised privacy questions about companies that handle more and more of the end-to-end process in users’ clickstreams through the internet: holding more of the links in the chain inevitably means being able to ascertain individuals’ actions, interests, motives and desires in their day-to-day lives. Read on for more info.

(more…)

Advanced Search Engine Optimization for College & University Websites

I earlier posted some basic tips for SEO of University & College websites here on Natural Search Blog. I’m now circling back around to post some advanced tips for optimization of .EDU sites. Some of these tips are more about helping out with overall marketing, albeit by using the college’s or university’s web presence to accomplish it. Even those ancillary efforts can contribute to overall online marketing and natural search optimization success.


I wrote those tips after getting a number of interested questions from educational professionals attending the AMA Hot Topics seminar that Stephan and I provided in San Fran a few weeks ago. Read on for the details of my .EDU secret sauce!

(more…)

AMA Hot Topic Series: Search Marketing in San Fran

The San Francisco leg of the American Marketing Association’s Hot Topic Series on Search Marketing this past Friday was really great! The crowd was intimate, which allowed all of us speakers to mingle and have some quality discussions with folks, and the seminar was excellently organized.

Read on for more details about the AMA Hot Topic Series day’s sessions.

(more…)

Dupe Content Penalty a Myth, but Negative Effects Are Not

I was interested to read a column by Jill Whalen this past week on “The Duplicate Content Penalty Myth” at Search Engine Land. While I agree with her assessment that there really isn’t a Duplicate Content Penalty per se, I think she perhaps failed to address one major issue affecting websites in relation to this.

Read on to see what I mean.

(more…)

In other news, a new free Clinic

Search Engine Journal today opened a free SEO Clinic for sites in need of optimization or with specific challenges that have not been overcome.

A group of leading SEOs including Carsten Cumbrowski, Ahmed Bilal, and Rhea Drysdale will review one submission per week, delivering a thorough assessment of usability and site navigation, link building, and copywriting from the perspective of placement in the four leading engines (Google, Yahoo!, MSN and Ask).

It’s clear though that “free” is as free as having your site criticized in one of the SEO clinics experts like to host at conferences. If chosen for review, the findings and recommendations will be posted for others to peruse. I’d do as much myself, and I appreciate their efforts to help others with these case studies. But as a website owner, someone responsible for SEO, or a marketing manager for a major brand, I might not be so inclined to have my successes and failures outlined in detail for everyone to see. That concern aside, I do hope they get some quality sites and develop a thorough library of reviews (perhaps I’ll sign up myself!).

To participate, simply contact the team here.


Nouveau Meta Tags for SEO

Back in the earliest days of search optimization, meta tags were a great channel for placing keywords for the search engines to associate with your pages. Meta tags do just what the name suggests: they are HTML tags built to hold metadata (“data describing the data”) about a page. In terms of SEO, the main meta tags people refer to are the Keywords and Description meta tags. Meta tags are not visible to end users looking at the page, but search engines would collect the meta tag content and use it to rank a page, which was really convenient if you wanted to pass synonyms, misspellings, and various term stems along with the specific keywords.

Classic Meta Tags - people used to pack keywords into metatags
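As an illustration of the classic approach, here’s a sketch of what such a header looked like (the site name and keywords are invented for this example):

```html
<head>
  <title>Acme Widgets</title>
  <!-- Keywords tag packed with synonyms, misspellings, and term stems -->
  <meta name="keywords" content="widgets, widget, wigets, cheap widgets, buy widgets, widget store">
  <!-- Description tag: engines once used this text for keyword association -->
  <meta name="description" content="Acme sells widgets of every size, shape and color.">
</head>
```

None of this text appears on the rendered page; it was purely a back-channel to the crawlers, which is exactly why it was so easy to abuse.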

Almost as soon as people realized that meta tags could make a page appear more relevant in the major search engines, unscrupulous operators began abusing the tags by passing keywords that had little or nothing to do with the content of their sites. The search engines responded by reducing their use of that content as a keyword-association ranking factor, because it couldn’t be trusted. Eventually, search engines dropped meta tags from ranking almost entirely, and newer engines didn’t bother to use them at all, leading Danny Sullivan to declare the death of the meta tags in 2002.

Fast forward to 2006, and the situation has changed yet again. Your meta tag content can once again directly affect your pages’ rankings in the SERPs!

(more…)

Using Flickr for Search Engine Optimization

I’ve previously blogged about optimization for Image Search. But images can also be used to optimize for regular web search. Where online promotion is concerned, this appears to be an area of advantage which remains largely untapped. Many pros focus most of their optimization efforts on the more popular web search results, and don’t realize that optimizing for image search can translate to good overall SEO.

Flickr is one of the most popular image sharing sites in the world, with loads of features that also make it qualify as a social networking site. Flickr’s popularity, structure and features also make it an ideal vehicle for search engine optimization. So, how can image search optimization be done through Flickr? Read on, and I’ll outline some key steps to take. (more…)

Google Sitemaps Reveal Some of the Black Box

I earlier mentioned the recent Sitemaps upgrades which were announced in June, and how I thought these were useful for webmasters. But, the Sitemaps tools may also be useful in other ways beyond the obvious/intended ones.

The information that Google has made available in Sitemaps is providing a cool bit of intel on yet another one of the 200+ parameters or “signals” that they’re using to rank pages for SERPs.

For reference, check out the Page Analysis Statistics that are provided in Sitemaps for my “Acme” products and services experimental site:

Google Sitemaps Page Analysis

It seems unlikely to me that these stats on “Common Words” found “In your site’s content” were generated just for the sake of providing nice tools for us in Sitemaps. No, the more likely scenario would seem to be that Google was already collating the most-common words found on your site for their own uses, and then they later chose to provide some of these stats to us in Sitemaps.

This is significant, because we’ve already known that Google tracks keyword content for each page in order to assess its relevancy for search queries made with that term. But, why would Google be tracking your most-common keywords in a site-wide context?

One good explanation presents itself: Google might be tracking common terms used throughout a site in order to assess if that site should be considered authoritative for particular keywords or thematic categories.

Early on, algorithmic researchers such as Jon Kleinberg worked on methods by which “authoritative” sites and “hubs” could be identified. IBM and others did further research on authority/hub identification, and I heard engineers from Teoma speak on the importance of these approaches a few times at SES conferences when explaining the ExpertRank system their algorithms were based upon.

So, it’s not all that surprising that Google may be trying to use commonly occurring text to help identify authoritative sites for various themes. This would be one good automated method for classifying sites into subject matter categories and keywords.

The take-away concept is that Google may be using words found in the visible text throughout your site to assess whether you’re authoritative for particular themes or not.

 

Optimize your roof ads for Google Maps

Since SEMs and SEOs are trying to use every way possible to increase their site exposure and ad visibility in the search engines, I thought it would be a good time to provide some tips on how to properly and effectively optimize your rooftop ads to appear in Google Maps.

Now, Danny Sullivan claimed that logos on rooftops are not intended for Google Maps, but this assertion is no longer correct: I heard a segment on NPR this past week about a rooftop ad company that is specifically gearing its ads to appear on the satellite images.

An article on Wired about that same company, RoofShout.com [7/14/08: link is now defunct], indicates that this may indeed be a viable new ad medium. For tips about how you can optimize for the rooftop media (which I will refer to as “SkySense Ads”), read on…
(more…)

Need more traffic? Try Image Search Optimization

With all the focus on optimization of textual page content and near-obsessive concentration on text-oriented web search engine results pages (“SERPs”), most webmasters and SEOs neglect an area of their potential repertoire which could provide a lot of benefit to their site and business: image search optimization.

One aspect of effective optimization is to keep your eyes open for all the various avenues for referral traffic which can convert to a sale on your site. Depending upon the products or services you offer, it may be very valuable to consider the possibilities of optimizing for the Image Search utilities offered by the various search engines. Even if your site isn’t a product or services website, if you’re looking to increase organic referral traffic, optimizing for image search could work well for you. Read on and I’ll explain…
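To make the basic idea concrete before the details, here is a minimal sketch of image markup that gives image-search crawlers descriptive text to index (the filename, alt text, and caption are invented for illustration):

```html
<!-- A descriptive filename, alt attribute, and nearby caption text
     all provide keyword context that image-search engines can index -->
<img src="/images/red-widget-deluxe.jpg"
     alt="Red deluxe widget, side view"
     width="400" height="300">
<p>Our red deluxe widget, shown from the side.</p>
```

The same picture saved as `IMG_0042.jpg` with no alt text gives the engines essentially nothing to match a query against.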
(more…)

RSS Feeds
Categories
Archives
Other