Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
-
Hey guys. Wondering if someone can help diagnose a problem for me.
Here's our site: https://www.flagandbanner.com/
We have a fairly large e-commerce site--roughly 23,000 URLs according to crawls using both Moz and Screaming Frog. I created an XML sitemap (using Screaming Frog) and uploaded it to Webmaster Tools. WMT is showing only about 2,500 of those URLs indexed. Further, WMT is showing that Google has indexed only about half (approx. 11,000) of the site's URLs. Finally (to add even more confusion), a site: search on Google shows only about 5,400 URLs found. The numbers are all over the place!
Here's the robots.txt file:
User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml
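For what it's worth, you can sanity-check rules like these locally with Python's stdlib urllib.robotparser. A sketch below, with two caveats: the blanket "Allow: /" is omitted (allowing is already the default, and robotparser applies rules in file order, so a leading Allow: / would short-circuit every Disallow after it, whereas Google uses longest-match precedence), and the trailing wildcard on /fab/* is dropped because robotparser does not support * wildcards:

```python
import urllib.robotparser

# The site's Disallow rules, minus the redundant "Allow: /" and with
# "/fab/*" simplified to "/fab/" (robotparser treats * literally).
ROBOTS_TXT = """\
User-agent: *
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check a few representative paths against the rules.
for path in ("/products/citizenship-gifts.asp", "/shopping_cart", "/temp/x"):
    print(path, rp.can_fetch("*", "https://www.flagandbanner.com" + path))
```

This only approximates Googlebot's matching, but it is a quick way to confirm that product and category pages are not accidentally blocked.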
Anyone have any thoughts as to what our problems are??
Mike
-
Thanks for the question!
First, it is very common to get inconsistent answers from GSC, site:, sitemap and crawl results. Don't worry too much about that.
Your goal is to get as many of your pages indexed as possible, and that is a function of the links pointing to your site and your internal link structure. While it is an imperfect analogy, we often refer to this as "crawl budget". There are essentially two solutions to this...
1. Get more/better backlinks to a diversity of pages on your site.
2. Improve your internal link architecture so that Googlebot finds your pages more quickly.
I think the problem in your case is that the site inundates bots with generic navigational links. For example, this page...
http://www.flagandbanner.com/products/chrome-air-force-lt-general-flag-kit.asp
has 1400 internal links! That is crazy!
This page has 1500!
https://www.flagandbanner.com/products/citizenship-gifts.asp
You need to reel this back in dramatically. Your navigation should link to top-level categories, or maybe a handful of subcategories. Once in a category, you can reveal deeper subcategories. This will increase the likelihood that the related and "also bought" links you find on product pages get found and followed by Googlebot.
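If you want to track that link count yourself as you trim the navigation, a short script will do it. Here's a sketch using Python's stdlib HTML parser (the class name and host value are just illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkCounter(HTMLParser):
    """Count <a href> links that point back to the same host."""

    def __init__(self, host):
        super().__init__()
        self.host = host
        self.internal = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        netloc = urlsplit(href).netloc
        # Relative links and same-host absolute links count as internal.
        if not netloc or netloc == self.host:
            self.internal += 1

counter = LinkCounter("www.flagandbanner.com")
counter.feed('<a href="/products/x.asp">x</a><a href="https://other.com/">y</a>')
print(counter.internal)  # → 1
```

Feed it the HTML of a product page (e.g. fetched with urllib.request) and you can watch the internal-link total drop as you simplify the navigation.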
Finally, on a different note, you need to make sure you standardize the casing of URLs (i.e., /Products/ vs. /products/). I noticed that you have links, both internal and external, that do not take this into account, causing unnecessary duplicate content.
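The real fix is a server-side 301 redirect to a single casing (on IIS, typically via the URL Rewrite module), but the canonicalization rule itself is simple. A sketch, with a hypothetical helper name:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url: str) -> str:
    """Lowercase the scheme, host, and path so /Products/ and /products/
    collapse to one canonical URL. The query string is left alone, since
    its values may be case-sensitive."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

print(canonical("https://www.flagandbanner.com/Products/Citizenship-Gifts.asp"))
# → https://www.flagandbanner.com/products/citizenship-gifts.asp
```

Every variant of a URL should 301 to its canonical form, and internal links should use that form directly so Googlebot never sees the duplicates in the first place.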
-
Thanks so much for your response, Russ.
You're confirming one of the many issues we had already identified (too many internal links), but I had not connected it to indexing or site speed. When I use the Google PageSpeed tool, many of our pages don't even register; it seems like they take too long to load, so the tool times out. Could the crazy number of links have something to do with this, too?
Moreover, our mobile speed is especially poor. This could be an even bigger problem in mobile, no?
Are you familiar with .asp sites, in particular, having indexing issues...or is that a false assumption?
Mike
-
A site running ASP should be perfectly fine. I bet you will see substantial increases in a lot of positive metrics just by paring down that navigation.