Issue with Site Map - how critical would you rank this in terms of needing a fix?
-
A problem has been introduced into our sitemap whereby previously excluded URLs are no longer being correctly excluded. These are returning an HTTP 400 Bad Request server response, although they do correctly redirect for users.
We have around 2,300 pages of content, and around 600-800 of these previously excluded URLs.
An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does correctly redirect for users).
The site is currently being rebuilt and has a life span of only a few months. With that in mind, the cost our current developers have quoted for resolving this seems quite high. I was just wondering:
-
How critical an issue would you consider this?
-
Would it be sufficient (bearing in mind this is an interim measure) to change these pages so that they have a canonical tag or a redirect? They would, however, remain in the sitemap.
Thanks
Kate -
-
- It's not a perfect solution, and if you can cover the costs then I would always do it, but honestly I can't say this is very critical to fix right now. Google and other search engines want you to have a very high-quality sitemap, so that all the pages in it exist and work for them as well as for users, but if a certain percentage of them don't work, I'd say it won't get you into serious trouble.
- I'm not really sure about this option, as it doesn't sound like an actual fix at all, so for now I would say don't do it.
-
Agree with Martijn. Technically, Google does not want any pages in the sitemap that return a 4xx or 3xx, or any response other than a 200.
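As a quick way to measure the scale of the problem, you could parse the sitemap and count the URLs that don't return a 200. A minimal sketch (the status-lookup callable is an assumption; in real use it might wrap an HTTP HEAD request):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Extract all <loc> URLs from a sitemap XML string."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def audit_sitemap(sitemap_xml, fetch_status):
    """Return the URLs whose HTTP status is not 200.

    fetch_status is any callable url -> status code; a real version
    (hypothetical here) could use requests.head(url).status_code.
    """
    return [u for u in sitemap_urls(sitemap_xml) if fetch_status(u) != 200]
```

With ~2,300 pages this runs in minutes and tells you exactly which 600-800 URLs are the 400s.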
I would say this: if your site is being rebuilt, having a sitemap that is accurate and that updates when you update the site is a basic requirement. The fact that there is a high cost for fixing this issue is baloney. It sounds like the devs did not build the site correctly the first time if they do not have a way to update the sitemap automatically.
You could generate the sitemap yourself
You can use tools like
http://tools.seochat.com/tools/online-crawl-google-sitemap-generator/
Or read tutorials on how to use Screaming Frog to create a sitemap
http://www.hmtweb.com/marketing-blog/dirty-sitemaps-how-to-download-crawl/
Frankly, the annual cost of Screaming Frog (about $150 a year) gets you so much more than just sitemaps. Buy Screaming Frog, have it generate your sitemap, and ask the devs to upload it. If you have a site with several thousand pages, just running Screaming Frog monthly would help you find issues on your site that are well worth the cost. Search "Screaming Frog" here in the forums and you can see that this is one of the "swiss army knives" of technical SEO.
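If you do generate the sitemap yourself from a crawl export, the key point is to keep only URLs that returned a 200. A minimal sketch (the `(url, status)` pair format is an assumption; a Screaming Frog CSV export would need a couple of lines of parsing first):

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a sitemap XML string from (url, status_code) pairs.

    Only 200-status URLs are included, matching the requirement that
    nothing in the sitemap should 3xx or 4xx.
    """
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url, status in pages:
        if status == 200:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```

Write the result to `sitemap.xml` and hand that file to the devs to upload; regenerating it after each crawl keeps the sitemap in step with the site.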
-