Is there a way to keep sitemap.xml files from getting indexed?
-
Wow, I should know the answer to this question.
Sitemap.xml files have to be accessible to bots so they can be crawled, which means you can't disallow them in robots.txt and you can't block the folder at the server level.
So how can you let the bots crawl these XML pages but keep them from showing up in Google's index when doing a site: command search? Or is that even possible? Hmmm
-
Usually you would add a noindex meta robots tag in the <head> of your web page. Because your sitemap is an XML file and not HTML, you will need to do things differently.
You can add the code below to your .htaccess file, which you can find in the root folder of your server (this requires Apache with mod_headers enabled). Open the file in a plain text editor and insert the following:

<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>

This sends the X-Robots-Tag: noindex HTTP header for the sitemap, which stops search engines from indexing it without restricting them from crawling it. Wrapping the directive in a <Files> block matters: without it, the header would be set on every response from your site, not just the sitemap.

Note: Replace 'sitemap.xml' with your file name - if different.
-
ahhh, noindex via the .htaccess file. Brilliant - thanks!