Can I limit Moz crawl to site map?
-
I have a WooCommerce site with 143 pages. When I crawl it in Moz, it identifies 49,000 pages (and reports errors for those 49,000 pages, such as missing metadata). Is there a setting in Moz to limit the crawl to the sitemap, or a robots.txt setting that would prevent this? Any other suggestions would be appreciated.
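For reference, this is the kind of robots.txt rule the question is asking about: a Disallow directive that stops well-behaved crawlers (including Moz's rogerbot) from descending into runaway paths. The path below is only illustrative; it assumes the bogus URLs all begin with a repeated segment like the one shown later in this thread:

```
# Hypothetical robots.txt sketch - adjust the path to match
# the actual runaway URLs on your site.
User-agent: *
# Block the recursively nested duplicates while leaving the
# real /shop/product/type/ page crawlable:
Disallow: /shop/product/type/product/
```

Note that robots.txt cannot limit a crawl to only the sitemap; it can only exclude path patterns you name explicitly.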
-
Whatever pages you include in your sitemap are what Moz will crawl (hopefully).
-
It's generating thousands of pages from my WooCommerce site, which only has 143 pages. They look like this (this is one page):
-
That looks like a URL error; normally it shouldn't go beyond /shop/product/type/ and certainly not /shop/product/type/product/type/product/type/.
Try generating a proper sitemap and then resubmitting to Moz.
-
Thanks for the help. I have a proper sitemap, and Google Analytics is reading the site and its 143 pages just fine, so I'm not sure what's up. It does this in Moz, SEMrush, and Screaming Frog, but not in GA.
-
Hi Brad!
Moz doesn't actually crawl sitemaps, but we do follow robots.txt directives. I think what is happening is that we are reading relative links in your source code and appending each one to the URL the crawler is currently on.
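The appending behavior described above can be sketched with Python's standard URL resolution: a relative link without a leading slash (e.g. `product/type/` instead of `/product/type/`) resolves against the current page's URL, so each hop nests one level deeper. The domain and paths here are illustrative, not taken from the actual site:

```python
from urllib.parse import urljoin

# Start on a legitimate page (hypothetical URL for illustration).
url = "https://example.com/shop/product/type/"

# If that page contains the relative link "product/type/"
# (no leading slash), a crawler resolves it against the current
# URL, producing an ever-deeper path on every hop.
for _ in range(3):
    url = urljoin(url, "product/type/")
    print(url)
```

Each iteration adds another `product/type/` segment, which is exactly the `/shop/product/type/product/type/product/type/` pattern seen earlier in the thread. The usual fix is to make the theme or plugin emit root-relative (`/product/type/`) or absolute links.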
If you can reach out to support at help@moz.com, we can take a look at your campaign and provide some additional insight here!
Best regards,
-
Thanks, Dave. I will do that.
Brad