Natural Search Blog


SMX LoMo Keynote: Frazier Miller

Frazier Miller, General Manager of Yahoo! Local, spoke yesterday here at the SMX Local & Mobile conference in San Francisco.

Yahoo! Local’s Frazier Miller

It was very interesting to hear the take on local & mobile from one of Yahoo! Local’s top thought leaders. It was obvious that Frazier has a very tight grip on understanding what motivates consumers and where the trends may be headed in local/mobile evolution.

Some highlights of Frazier’s presentation included: (more…)

Yahoo Collaborates With McAfee To Secure Search Results

It was announced this week that Yahoo! and McAfee are teaming up to help fight malware. Yahoo’s Search team will take McAfee information on malicious sites and use that to filter those sites out of their search results. In addition, McAfee can take some data from Yahoo’s search results to help them identify more malicious domains. (more…)

Using Flickr to Optimize for Yahoo Image Search

Google Blogoscoped reports that Yahoo’s Image Search now particularly likes Flickr content, so this may be an incentive for webmasters to use Flickr “as a kind of Yahoo search engine optimization”. My frequent readers know that I’ve been advocating using Flickr for image search optimization for some time now, and I’ve been speaking on this subject at Search Engine Strategies conferences as well.

The Blogoscoped mention of Yahoo’s love for Flickr content is particularly timely, since Yahoo! announced back in June that they were permanently shutting down Yahoo! Photos in favor of their Flickr property, and the final closing date is tomorrow, September 20th.

Previously, I’d railed a bit against Yahoo! because I’d seen a lot of evidence that they didn’t spider/index Flickr content as well or as comprehensively as Google did — altogether ironic since Yahoo owns Flickr. Just as with the anecdotal reports in the Blogoscoped post, I’m seeing nice indications that the issue behind my earlier criticism of Yahoo’s lack of inclusion of Flickr content may now be completely resolved. (more…)

Now MS Live Search & Yahoo! also treat Underscores as word delimiters

So, I earlier highlighted how Stephan reported on Matt Cutts revealing that Google treats underscores as white-space characters. Now Barry Schwartz has done a fantastic follow-up by asking each of the other search engines whether they also treat underscores just like dashes and other word-separator characters, and they’ve confirmed that they handle them similarly. This is another incremental paradigm shift in search engine optimization!
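To illustrate the practical effect, here’s a minimal sketch of my own (the function name and sample filenames are invented for this post, not anything the engines have published) showing a tokenizer that treats underscores exactly like dashes when breaking a URL filename into keywords:

import re

def tokenize_slug(slug):
    """Break a URL slug into keyword tokens, treating underscores, dashes,
    and any other non-alphanumeric characters as word delimiters."""
    slug = re.sub(r"\.\w+$", "", slug.lower())   # drop the file extension
    return [token for token in re.split(r"[^a-z0-9]+", slug) if token]

# With underscores handled as delimiters, both of these filenames now
# yield the same keywords for the engines to match against queries.
print(tokenize_slug("seo_tips_for_2007.html"))  # ['seo', 'tips', 'for', '2007']
print(tokenize_slug("seo-tips-for-2007.html"))  # ['seo', 'tips', 'for', '2007']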

I’ve previously opined that classic SEO may become extinct in favor of Usability, and announcements like this fluid handling of underscores would tend to support that premise. Google, Yahoo! and MS Live Search have been actively trying to reduce barriers to indexation and ranking abilities by changes like this plus improved handling of redirection, and myriad other changes which both obviate the need for technical optimizers and reduce the ability to artificially influence rankings through technical improvements.

I continue to think that the need for SEOs may decrease until they’re perhaps no longer necessary, so natural search marketing shops will likely evolve into site-building/design studios, copy writing teams, and usability research firms. The real question would be: how soon will it happen?


Yahoo’s Recent Spider Improvement Beats Google’s


Yahoo!’s Search Blog announced yesterday that they were making some final changes to their spider (named “Slurp”), standardizing their crawlers to provide a common DNS signature for identification/authorization purposes.

Previously, Slurp’s requests may have come from IP addresses associated with inktomisearch.com, and now they should all come from IPs associated with domains in this standard syntax:

[something].crawl.yahoo.net
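For webmasters who authorize crawlers by hostname, the usual double-check still applies: reverse-DNS the requesting IP, confirm the hostname falls under crawl.yahoo.net, then forward-resolve that hostname and make sure it points back to the same IP. A rough sketch of my own (not Yahoo-supplied code) could look like this:

import socket

def is_yahoo_slurp(ip_address):
    """Verify a claimed Slurp request: reverse-DNS the IP, check that the
    hostname ends in .crawl.yahoo.net, then forward-resolve the hostname
    and confirm it maps back to the original IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
    except socket.herror:
        return False
    if not hostname.endswith(".crawl.yahoo.net"):
        return False
    try:
        return socket.gethostbyname(hostname) == ip_address
    except socket.gaierror:
        return False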

(more…)

Google, Yahoo & Microsoft to Cooperate on Sitemaps

I was delighted today that the Google and Yahoo search engines announced at PubCon that they would jointly support and collaborate on one protocol for webmasters to use in submitting their site URLs for potential inclusion. View the video of the announcement here. Microsoft has also apparently agreed to use the same protocol.

To support this initiative, they will jointly support sitemaps.org. If you recall, “sitemaps” was the product name that Google had been using, and which was deprecated just a few months ago in favor of “Google Webmaster Tools”. Obviously, the wheels had already begun turning to repurpose the “Sitemaps” brand name into a jointly-operated service.

Even when Sitemaps are generated to follow the common protocol, webmasters will still need to submit the link feeds to each of the SEs via their existing management tools, such as Google Webmaster Tools and Yahoo! Site Explorer.
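To show just how lightweight the shared protocol is, here’s a rough sketch of my own that emits a minimal Sitemap file per the sitemaps.org 0.9 schema (the example.com URLs are placeholders); the resulting file is what you’d then point each engine at through its tools:

from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML Sitemap string following the sitemaps.org 0.9
    protocol.  `urls` is a list of (loc, lastmod) tuples; lastmod may be None."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in urls:
        lines.append("  <url>")
        lines.append("    <loc>%s</loc>" % escape(loc))
        if lastmod:
            lines.append("    <lastmod>%s</lastmod>" % lastmod)
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

# Placeholder URLs only; substitute your own site's pages.
print(build_sitemap([("http://www.example.com/", "2006-11-16"),
                     ("http://www.example.com/about.html", None)]))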

If you recall, I was one of a number of webmasters out there who had requested that they collaborate on a common protocol, such as in a blog post I wrote back in September:

“Hopefully each of the major search engines will try to employ identical or compatible formats for site URLs, because it will be a hassle to have to keep up with multiple formats. This is an area where the SEs really ought to cooperate with one another for “pro bono publico” – for the common good. Currently, Yahoo seems to be just defensively imitating Google in this arena, and no one’s showing signs of collaborating.”

Kudos to Google and Yahoo for overcoming traditional corporate competitiveness to do something that mutually benefits website owners as well as the search engines!

 


SEO May Be Eclipsed by User-Centered Design

I’ve been seeing indications that Google has shifted the weighting of the ~200 various signals in their ranking soup over the past couple of years. It used to be that PageRank, along with the number of keyword references on a page, was among the strongest signals determining which page comes up highest in the search results, but I’ve seen more and more cases where PageRank and keyword density seem relatively weaker than they once were. I see a lot of reasons to believe that quality ratings have become weighted more heavily for rankings, particularly among more popular search keywords. Google continues to lead the pack in the search marketplace, so their evolution will likely influence their competitors in similar directions, too.

So, what is my evidence that Google’s development of Quality criteria is becoming more influential in their rankings than PageRank and other classic optimization elements? Read on and I’ll explain. (more…)

To Use Sitemaps, or Not To Use Sitemaps, That’s the Question

It was really great when Google launched its Sitemaps program (recently renamed Webmaster Tools, as part of their Webmaster Central utilities). That launch signaled a new era in which technicians who wished to help make their pages findable would not automatically be considered “evil”, and the SEs might provide tools to help technicians disclose their pages directly. Yahoo soon followed with their own tools, named Yahoo! Site Explorer, and surely MSN will bow to peer pressure with their own submission system and tools.

Initially, I thought that there wasn’t significant advantage to me for using these systems, because I’d already developed good methods for providing our page links to the search engines through the natural linking found in our site navigation systems.

Why should I expend yet more time and resources to dynamically produce the link files?

(more…)

Flickr Adds Geotagging Features

In a move that proves that the people behind flickr are still channeling the Web 2.0 mass consciousness, flickr announced this week that they’re adding geotagging features to their already-robust suite of image management products.

As you may recall, I previously blogged a bit about the rise of geotagging, particularly geotagging of photos, and I had said that it seemed to be a really strong idea with a lot of potential uses. It’s gratifying to see that a service like flickr (and a company like Yahoo!) also believes that it will be strategically beneficial.

I’d guess that the number of people who have been geotagging, or who even know about it, is still a relatively low percentage of the online populace, partly because most people don’t have a GPS device to tell them a location’s longitude and latitude. Now that a top-ranked photo site supports it expressly, droves of users will become educated about it and experiment with it. By doing this, flickr is propelling the trend into the mainstream, increasing the likelihood that it’ll be more widely adopted.

Flickr’s new geotagging utilities were built by mashing up their image management utilities with Yahoo! Maps, allowing users to drag pix onto the mapped location where an image was taken in order to associate the photo with the geotag. It also appears that users can now use a graphic map as a navigational interface to browse geographic locations and then pull up any publicly-available photos associated with that location. Read on for more info.
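For anyone wanting to attach geotags programmatically rather than dragging photos around the map, Flickr’s API exposes a flickr.photos.geo.setLocation method. The sketch below is just my own illustration: the key, token, signature, photo id, and coordinates are placeholders, and a real call requires a write-authorized, properly signed request per Flickr’s authentication rules.

from urllib.parse import urlencode
from urllib.request import urlopen

FLICKR_REST = "http://api.flickr.com/services/rest/"

def set_photo_location(api_key, auth_token, api_sig, photo_id, lat, lon):
    """Sketch of calling flickr.photos.geo.setLocation to attach a
    latitude/longitude to a photo.  All credential values here are
    placeholders; a real request must be signed and use a token with
    write permission."""
    params = urlencode({
        "method": "flickr.photos.geo.setLocation",
        "api_key": api_key,          # placeholder
        "auth_token": auth_token,    # placeholder
        "api_sig": api_sig,          # placeholder
        "photo_id": photo_id,
        "lat": lat,
        "lon": lon,
    })
    with urlopen(FLICKR_REST, params.encode("utf-8")) as response:
        return response.read()

# Hypothetical usage: tag a photo as taken in downtown San Francisco.
# set_photo_location("KEY", "TOKEN", "SIG", "123456789", 37.7793, -122.4192)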

(more…)

Yahoo update beefs up on authority sites

Aaron Wall posted a blog entry about how Yahoo!’s recent algorithm update has apparently increased the weighting of links and authority sites.

Predictably, a number of folks have complained about the update in the comments on Yahoo’s “Weather Report” blog post. Jeremy Zawodny subsequently posted that their search team was paying close attention to the comments, which is always nice to hear.

Coincidentally, I’d also just recently posted about Google’s apparent use of page text to help identify a site’s overall authoritativeness for particular keywords/themes.

As they say, there’s nothing really new under the sun. I wonder if the search engines are all returning to the trend of authority/hub focus in algorithm development? It’s a strong concept and useful for ranking results, so the methodology for identifying authorities and hubs is likely here to stay.
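For readers who haven’t run across the hubs-and-authorities idea before, here’s a bare-bones sketch of my own of the classic iterative calculation on a tiny made-up link graph; the engines obviously layer far more signals on top, so treat it purely as a conceptual illustration:

def hits(links, iterations=50):
    """Compute hub and authority scores for a link graph.
    `links` maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hubs = {p: 1.0 for p in pages}
    auths = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # Authority score: sum of hub scores of the pages linking to you.
        auths = {p: sum(hubs[q] for q in links if p in links[q]) for p in pages}
        norm = sum(v * v for v in auths.values()) ** 0.5 or 1.0
        auths = {p: v / norm for p, v in auths.items()}
        # Hub score: sum of authority scores of the pages you link to.
        hubs = {p: sum(auths[t] for t in links.get(p, [])) for p in pages}
        norm = sum(v * v for v in hubs.values()) ** 0.5 or 1.0
        hubs = {p: v / norm for p, v in hubs.items()}
    return hubs, auths

# Tiny made-up graph: pages A and B both link to C; C links to D.
example = {"A": ["C"], "B": ["C"], "C": ["D"]}
hub_scores, authority_scores = hits(example)
print(sorted(authority_scores.items(), key=lambda kv: -kv[1]))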
