Would a search engine treat a sitemap hosted in the cloud the same way as one served from /sitemap.htm on the site itself?
-
Mainly to allow updates without needing to republish the site. Would Google interpret it any differently?
Thanks
-
I haven't run an experiment on this, but it can be done by referencing the sitemap file from robots.txt. You can read more here: https://www.sitemaps.org/protocol.html#sitemaps_cross_submits. Basically, you provide the link to the cloud-hosted file and tell crawlers that it is the sitemap for a given website. I don't think Google treats these files any differently.
In robots.txt, add:

Sitemap: https://yourcloudprovider.com/sitemap.htm

(or .xml, or whatever extension you use)
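To sanity-check that the directive is in place, you can parse robots.txt and list the declared sitemap URLs. A minimal sketch; the domain and sitemap URL below are placeholders, not a real site:

```python
def sitemap_urls(robots_txt: str) -> list[str]:
    """Extract the URLs declared by 'Sitemap:' lines in a robots.txt body."""
    urls = []
    for line in robots_txt.splitlines():
        # The Sitemap directive is conventionally case-insensitive.
        if line.strip().lower().startswith("sitemap:"):
            # Split only on the first colon, so the URL's own colons survive.
            urls.append(line.split(":", 1)[1].strip())
    return urls

# Example robots.txt content (placeholder host):
robots = (
    "User-agent: *\n"
    "Allow: /\n"
    "Sitemap: https://yourcloudprovider.com/sitemap.xml\n"
)
print(sitemap_urls(robots))  # ['https://yourcloudprovider.com/sitemap.xml']
```

In practice you would fetch https://yoursite.com/robots.txt (e.g. with urllib) and run the same check against the response body.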
Hope this helps.
-
I can second this: it doesn't seem to matter where your sitemaps live, and definitely not if you link to them from your robots.txt file, since that serves as proof that you control their location.
-
How can you submit them to Search Console if they don't live on your root domain? I understand that you can reference the cloud sitemap URL in robots.txt, but without it being in Search Console you lose visibility into errors and indexing issues.