Google Webmaster Tools Parameters
-
We run several large ecommerce websites, and we've added some tracking parameters in GWT for Google to ignore. All pages are correctly canonicalized.
Google has been ignoring both the parameter settings and the canonicals, and still ranks many parametered URLs for us. Has anyone run into this?
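For context, the canonical setup described above would look something like this (the domain and the "clicksource" parameter here are placeholders, not the actual site):

```html
<!-- On the parametered URL, e.g.
     https://www.example.com/widgets?clicksource=email
     the canonical points Google at the clean version of the page: -->
<link rel="canonical" href="https://www.example.com/widgets" />
```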
-
Hi there
When did you put these parameters in place? It can take Google some time to re-crawl your site and pick up the settings you've made in GWT. So, depending on how recently you implemented these parameters, you may still see some of these URLs ranking.
Do you have these parameters in your sitemap or internal links where they are not needed? If so, make sure you correct them.
You can also look into robots.txt options with your web development team to block these parameters, although I would research this carefully so you don't accidentally block important URLs.
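A sketch of the kind of robots.txt rule this refers to, using "clicksource" as an example parameter name; test this against your real URL patterns before deploying, since a wildcard rule like this can easily block more than you intend:

```
User-agent: *
# Block any URL where the tracking parameter appears first or later
# in the query string ("clicksource" is an example name here)
Disallow: /*?clicksource=
Disallow: /*&clicksource=
```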
Hope this helps! Good luck!
-
Very often, it isn't enough to handle this through parameter settings alone. As Patrick said, you can add 'Disallow' rules in robots.txt, or a noindex tag at the page level if you want pages de-indexed. Be careful how you do this, though, and make sure Google can see and act on the noindex before you block access to the page.
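The page-level noindex Andy mentions is a meta tag in the page head, along these lines (note the ordering caveat from his answer):

```html
<!-- Page-level noindex: the page must remain crawlable (i.e. NOT
     blocked in robots.txt) until Google has seen this tag and
     dropped the page, otherwise the tag is never discovered -->
<meta name="robots" content="noindex, follow" />
```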
-Andy
-
Patrick:
I set up the "clicksource" parameter in Google Webmaster Tools in March, and we still have 2,400 parametered pages indexed.
-
Building on Patrick's answer: using robots.txt was the fastest way for things to take effect for me, but if implemented wrong it can hurt you pretty badly.
In addition to adding the exclusions to robots.txt, I also submitted removal requests for the parametered pages via the Google URL Removal tool. Combined with the other options, this helped clean up my results in Google's index.
Hope this helps.