This is sort of good news: Yahoo has announced that site owners who register and verify their sites through Yahoo Site Explorer will be able to specify a small number of URL variables (parameters) that should be ignored. This is mainly useful for parameters/variables like session IDs that don’t actually identify unique content.
An example of a session variable in a URL would be something like:

http://www.example.com/products.php?category=widgets&PHPSESSID=a1b2c3d4e5
Some shopping carts and other applications fall back on putting a session variable in the URL when the visitor doesn’t accept a session cookie. Since spiders don’t accept ANY cookies, you have to write your applications to recognize spiders and not give them a session ID, or only create a session when the visitor does something (like adding to their shopping cart) that you actually need to track.
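To make that concrete, here’s a rough sketch of the logic in Python. The bot list, the `sessid` cookie name, and the helper names are all my own illustration, not any particular cart’s API; the point is simply that spiders never get a URL session ID, and cookieless visitors only get one once they do something trackable.

```python
import re
from typing import Optional

# Illustrative, not exhaustive: match the user agents of the major crawlers.
BOT_PATTERN = re.compile(r"(googlebot|slurp|msnbot|bingbot)", re.IGNORECASE)

def is_spider(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a known search spider."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def session_id_for(user_agent: str, cookies: dict, cart_action: bool) -> Optional[str]:
    """Decide whether this request should get a URL-based session ID.

    - Spiders never get one (otherwise they index every URL variant).
    - Visitors with a session cookie don't need one in the URL.
    - Cookieless visitors only get one once they do something worth
      tracking, like adding an item to the cart.
    """
    if is_spider(user_agent):
        return None
    if "sessid" in cookies:
        return cookies["sessid"]
    if cart_action:
        return "new-session-id"  # placeholder for real session-ID generation
    return None
```

So a Googlebot request never carries a session ID, and a first-time human visitor only picks one up when they add something to the cart.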
Unfortunately, since Yahoo has decided to do this on their own, other search engines will continue to have problems with poorly written applications.
What the SEO world really needs is for the search engines to all come together, as they did with the Sitemap protocol, and agree on one standard method that would allow site owners to tell them which variables to ignore. One obvious approach would be to extend the robots.txt file with a list of variables to ignore.
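Such an extension might look something like this. To be clear, this syntax is entirely made up for illustration; no engine supports it today:

```
User-agent: *
Ignore-param: PHPSESSID
Ignore-param: sid
Ignore-param: affiliate_id
```

Any crawler honoring a directive like this would treat URLs that differ only in those parameters as the same page, which is exactly what Yahoo’s new feature does, just without the registration step.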
The fact that Yahoo is going it alone on this one is a pretty strong indication to me that whatever conversations are taking place between the search engines on protocols, this isn’t seen as an important issue. Kudos to Yahoo for exhibiting some leadership, but this is a problem that should have been solved ages ago.