Thanks, Dave. I will do that.
Brad
Thanks for the help. I have a proper sitemap, and Google Analytics is reading the site and the 143 pages just fine, so I'm not sure what's up. It does this in Moz, SEMrush, and Screaming Frog, but not in GA.
It's generating thousands of pages from my WooCommerce site, which only has 143 pages. They look like this (this is one page):
I have a WooCommerce site with 143 pages. When I crawl it in Moz, it identifies 49,000 pages (and reports errors for those 49,000 pages, such as missing metadata). Is there a setting in Moz to limit the crawl to the sitemap, or is there a robots.txt setting that would prevent this? Any other suggestions would be appreciated.
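A common cause of this kind of crawl blowup on WooCommerce sites is crawlable URL parameters (layered-nav filters, sorting, add-to-cart links), which can multiply a 143-page site into tens of thousands of parameter combinations. As a sketch, rules like the following in robots.txt block those parameter URLs; the specific parameters and the sitemap URL here are assumptions, so check your crawl report for the actual patterns before using them:

```
# Sketch: block common WooCommerce parameter URLs (adjust to your site)
User-agent: *
Disallow: /*?orderby=
Disallow: /*?filter_
Disallow: /*?add-to-cart=
# Non-indexable shop pages
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/

# Placeholder URL - point this at your real sitemap
Sitemap: https://example.com/sitemap.xml
```

Most major crawlers, including Moz's Rogerbot and (by default) Screaming Frog, respect robots.txt, so this should shrink the crawl. Screaming Frog can also crawl just your sitemap's URLs via List mode if you want to compare against the 143 known pages.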