301'ing Googlebot
-
I have a client that has been 301'ing Googlebot to the canonical page because their URLs contain cart_id and session parameters. This mainly happens when Googlebot comes in on a link that has these parameters in the URL; once Googlebot starts crawling the site, they don't serve these parameters to it at all.
I am worried about cloaking and wanted to know if anyone has any information on this.
I know that Google has said that doing anything where you detect Googlebot's user agent and treat it differently is a problem.
If anybody has had any experience with this, I would be glad to hear it. -
Hello Alan,
If they are only doing this for Google then it is indeed cloaking, regardless of their intent. That may or may not land them in hot water, but I think Google has provided plenty of other ways to handle this situation.
First and foremost, it would be best if those session IDs and certain other parameters were put into a cookie instead of the URL.
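As a rough sketch of that idea (the parameter names cart_id and session are assumptions based on the question, not a specific platform's API), the session state can be pulled out of the URL and handed back as a cookie, so every visitor, crawler or not, ends up on the same clean URL:

```python
# Sketch: move session parameters out of the URL and into a cookie.
# Parameter names are hypothetical; adapt to the client's actual URLs.
from http import cookies
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def clean_url_and_cookie(url, session_params=("cart_id", "session")):
    """Strip session parameters from a URL; return the clean URL
    and a Set-Cookie header carrying the stripped values."""
    parts = urlparse(url)
    query = parse_qs(parts.query)
    jar = cookies.SimpleCookie()
    for name in session_params:
        if name in query:
            jar[name] = query.pop(name)[0]
    clean = urlunparse(parts._replace(query=urlencode(query, doseq=True)))
    return clean, jar.output(header="Set-Cookie:")

url, cookie = clean_url_and_cookie("https://example.com/product?id=7&cart_id=abc123")
# url is now "https://example.com/product?id=7"
```

The point is that the cleanup happens for everyone, so there is nothing user-agent-specific going on.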
As you probably know, you can tell Google and Bing how to handle the different parameters with their free webmaster tools, so that would seem to me like the best approach if you cannot get rid of the parameters altogether.
You can also put a rel canonical tag in the header that references the version of the URL without the parameters.
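For illustration, here is one way the canonical tag could be generated by stripping the session parameters from the requested URL (again, cart_id and session are assumed names from the question):

```python
# Sketch: build a rel=canonical tag pointing at the parameter-free URL.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

STRIP = {"cart_id", "session"}  # hypothetical session parameter names

def canonical_tag(url):
    """Return a <link rel="canonical"> tag for the URL with
    session parameters removed (other parameters are kept)."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in STRIP]
    clean = urlunparse(parts._replace(query=urlencode(kept)))
    return '<link rel="canonical" href="%s" />' % clean

tag = canonical_tag("https://example.com/widgets?page=2&session=xyz")
# -> '<link rel="canonical" href="https://example.com/widgets?page=2" />'
```

That tag goes in the page's head section on every parameterized version of the URL.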
Force redirecting Googlebot specifically is not a good idea. If, for instance, you wanted to force redirect by IP - as is the case with many global sites that have geo-specific landing pages - that would be fine, as long as you aren't making any special exceptions for Googlebot.
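If the client does want to keep 301'ing away from session URLs, the safe version of that is to redirect every visitor the same way, with no user-agent check anywhere. A minimal sketch as a WSGI app (parameter names again assumed):

```python
# Sketch: 301 any request carrying session parameters to the clean URL,
# for all user agents alike -- no Googlebot detection.
from urllib.parse import parse_qsl, urlencode

STRIP = {"cart_id", "session"}  # hypothetical session parameter names

def app(environ, start_response):
    query = parse_qsl(environ.get("QUERY_STRING", ""))
    kept = [(k, v) for k, v in query if k not in STRIP]
    if len(kept) != len(query):  # session params present: redirect everyone
        location = environ["PATH_INFO"]
        if kept:
            location += "?" + urlencode(kept)
        start_response("301 Moved Permanently", [("Location", location)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"page body"]
```

Because the decision is based only on the URL, Googlebot and regular visitors get identical treatment, which is exactly what keeps it out of cloaking territory.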
Did this answer your question?
Good luck!
Everett