Google indexing is slowing down?
-
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap.
We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't gotten a crawl-related error for 2 weeks now.
Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I'm not sure how to get all our pages indexed if it's going to operate like this... would love some help, thanks!
-
Ryan,
G doesn't crawl/index every URL it encounters. Its decision is based on its perceived value of the page/the page the link is on. You can increase that value/PR/PA by sculpting your internal links, revising the silo-ing of your content, and/or acquiring backlinks to pages deeper within your site.
If you're a brand new site with 20M pages, what reason have you given Google to crawl all those pages? Just submitting them or listing them in a sitemap isn't enough of a reason. I mean, 20M pages! Google's like, damn dude, what's up with all that?
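One practical note on the sitemap side: the sitemaps.org protocol caps each sitemap file at 50,000 URLs (and 50 MB uncompressed), so 20M pages would need roughly 400 sitemap files tied together by a sitemap index. Here's a rough sketch of the chunking in Python; the `example.com` URLs are placeholders, not your actual site:

```python
# Sketch: split a large URL list into 50k-URL sitemap files plus an index.
# All URLs below are placeholders for illustration.

SITEMAP_URL_LIMIT = 50_000  # per-file cap from the sitemaps.org protocol

def chunk_urls(urls, limit=SITEMAP_URL_LIMIT):
    """Yield successive lists of at most `limit` URLs."""
    for i in range(0, len(urls), limit):
        yield urls[i:i + limit]

def sitemap_xml(urls):
    """Render one <urlset> sitemap file for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

def sitemap_index_xml(sitemap_urls):
    """Render the sitemap index that points at each sitemap file."""
    entries = "\n".join(
        f"  <sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

# Placeholder example: 120,001 pages -> 3 sitemap files.
all_urls = [f"https://example.com/page/{n}" for n in range(120_001)]
chunks = list(chunk_urls(all_urls))
print(len(chunks))  # 3
```

You'd submit only the sitemap index URL in Search Console; Google discovers the individual sitemap files from it.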
-
Share your URL for better responses.
Have you submitted a complete XML sitemap through Google Search Console? And are all the pages free of a noindex meta tag?
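If you want to spot-check pages for a noindex directive at scale, a quick standard-library sketch like this works; feed it the page HTML (fetched however you like), keeping in mind it only inspects meta tags, not the `X-Robots-Tag` HTTP header:

```python
# Sketch: detect a robots noindex meta tag in page HTML using only stdlib.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        # Covers both the generic "robots" and Google-specific "googlebot" tags.
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    parser = NoindexFinder()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
print(has_noindex('<meta name="robots" content="index,follow">'))    # False
```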
20 million unique pages can only be created in some automated fashion based on a much smaller amount of original content, so Google may perceive these pages to be spammy and not worth indexing.
-
Try getting links to your important pages. As for Google crawling and indexing: I have websites containing 10k URLs, and I know half of them aren't relevant for search at all. Google handles this completely automatically and cuts out the most duplicate or non-relevant pages compared to what I offer in my sitemap, bringing around 4k pages into actual search.
It's really not an issue. You can't tell me you've got 20 million pages of unique (user-)generated content.