Adding your sitemap to robots.txt
-
Hi everyone,
Best practice question:
When adding your sitemap to your robots.txt file, do you add the whole sitemap at once, or do you list the different subcategory sitemaps (products, posts, categories, etc.) separately?
I'm very curious to hear your thoughts!
-
From a crawlability point of view, it does not matter. Search engines have no more problems crawling multiple sitemap files than they do crawling one very large XML sitemap file.
An advantage of splitting out your XML sitemaps is that if your site is very large, you are less likely to run into the 50 MB / 50,000 URL limit. If the site is quite small, you obviously won't benefit from this.
If you use multiple sitemaps, you don't have to list them all in robots.txt. You can use a sitemap index file that points to your subcategory sitemaps (e.g. posts.xml). Any changes to the 'child' XML sitemaps then don't need to be reflected in robots.txt; you only need to remember to add or remove them from the XML index file and from Google Search Console / Bing Webmaster Tools.
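For anyone who hasn't seen one, a sitemap index is just a small XML file listing the child sitemaps. A minimal sketch, following the sitemaps.org schema (the filenames and example.com domain are placeholders, assuming the subcategory sitemaps mentioned above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Adding or removing a child sitemap is then a one-line change in this file, and robots.txt never needs to be touched.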
Since many CMSs and site platforms automatically generate XML sitemaps grouped by posts, categories, products, etc., we find it easiest to keep this default configuration and simply add the sitemap index URL to robots.txt.
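In practice that means robots.txt only needs a single `Sitemap:` line. A minimal sketch (the filename `sitemap_index.xml` and the example.com domain are placeholders; the `Sitemap:` directive must use the full absolute URL):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap_index.xml
```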
-
Just add the sitemap index file to your robots.txt and let the search engines figure it out from there. You basically just want to point them to your sitemaps, and they can do that from the sitemap index alone, so there's no real need to list them all individually.