Google Indexing of Sitemap
-
We recently launched a new site - on June 4th we submitted our sitemap to Google and almost instantly had all 25,000 URLs crawled (yay!).
On June 18th, we made some updates to the title & description tags for the majority of pages on our site and added new content to our home page, so we submitted a new sitemap.
So far the results have been underwhelming: Google has re-indexed very few of the updated pages. As a result, only a handful of the new titles and descriptions are showing up in the SERPs.
Any ideas as to why this might be? What are the tricks to getting Google to re-index all of the URLs in a sitemap?
-
Hi there
Did you read through Google's indexing resources?
I would also try doing a quick "site:yourdomain.com" search and see how many pages Google pulls up - that's a more accurate picture of what's actually indexed from your site. This is reflected in the resource above:
"Sometimes the data we show in Index Status is not fully reflected in Google Search results." I suggest reading through the resource and also performing that search. Getting Google to re-index your sitemap is a waiting game; keep an eye on it and be patient!
Hope this helps! Good luck!
-
Across the thousands of sites we have submitted, all show an initial spike in ranking and indexing before things settle down for the long haul. It seems like Google does a "best guess" pass before taking the time to fully crawl and analyze all of the URLs and rank them accordingly. As always, resubmit the pages through all the webmaster tools (Bing too!) so the engines are always aware of the most recent updates. If you are planning on updating the pages frequently, I would set the crawl frequency (changefreq) to daily in your sitemap. They probably won't honor it anyway, but you can try.
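If your sitemap is a static file you generate yourself, a rough sketch like this (plain Python; "sitemap.xml" is a placeholder path, and again changefreq is only a hint that Google is free to ignore) will set every entry to daily and refresh the lastmod dates before you resubmit:

# Rough sketch: set <changefreq> to "daily" and refresh <lastmod> for every URL
# in an existing sitemap.xml before resubmitting it. The file path is a
# placeholder; changefreq is only a hint, so Google may ignore it.
import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

tree = ET.parse("sitemap.xml")
today = datetime.date.today().isoformat()

for url in tree.getroot().findall("{%s}url" % NS):
    for tag, value in (("changefreq", "daily"), ("lastmod", today)):
        elem = url.find("{%s}%s" % (NS, tag))
        if elem is None:
            elem = ET.SubElement(url, "{%s}%s" % (NS, tag))
        elem.text = value

tree.write("sitemap.xml", xml_declaration=True, encoding="utf-8")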

Use Fetch as Google religiously when you update. It is your friend.
-
David,
Thanks for your response. This is exactly what we've seen: the initial spike in ranking and now things settling down. I'll make sure the team has the crawl frequency set to daily (which I think it already is).
For Fetch as Google - what's the best way you've used it? For example, I just checked a page and saw that some images weren't being indexed. If I correct the issue, can I just use "Submit to Index"?
Thanks!!!!
-
Thank you!! I'll take a look through the Google resource. Also, the site:domain search revealed 35,000 results.
The results are there, just not re-indexed.
-
No problem, it's actually really easy:
https://www.google.com/webmasters/tools/googlebot-fetch
Once you have selected your account, add the URL and then submit to index. I would do the homepage first, and for that page use the "Crawl this URL and its direct links" option. Then for the subpages use the "Crawl only this URL" option. It can also help to do "Crawl this URL and its direct links" for any of your top-level menu items to speed things up.
"For example, i just checked a page and saw that some images weren't being indexed." Does your robots file allow specific access to those pages? If not, here is how you can set it to do so. This will also allow Google's partners to access your images. Add this to the bottom of your robots file:
User-agent: Googlebot-Image
Allow: /images/
User-agent: Adsbot-Google
Allow: /
User-agent: Googlebot-Mobile
Allow: /
User-agent: Mediapartners-Google*
Allow: /
Sitemap: http://www.YOURSITEHERE.com/sitemap.xml
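Once that is live, a quick sanity check like this (plain Python, standard library only; www.YOURSITEHERE.com is the same placeholder as above) will confirm the new rules behave the way you expect:

# Quick check with Python's built-in robots.txt parser: fetch the live file
# and confirm the image bot is actually allowed in. www.YOURSITEHERE.com is a
# placeholder for your own domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.YOURSITEHERE.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt

# Both should print True once the Allow rules above are in place.
print(rp.can_fetch("Googlebot-Image", "http://www.YOURSITEHERE.com/images/logo.png"))
print(rp.can_fetch("Googlebot-Mobile", "http://www.YOURSITEHERE.com/"))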