Will using http ping, lastmod increase our indexation with Google?
-
If Google knows about our sitemaps and they’re being crawled on a daily basis, why should we use the http ping and /or list the index files in our robots.txt?
- Is there a benefit (i.e. improving indexability) to using both ping and listing index files in robots?
- Is there any benefit to listing the index sitemaps in robots if we’re pinging?
- If we provide a decent <lastmod> date, is there going to be any difference in indexing rates between the ping and the normal crawl that they do today?
- Do we need to do all of the above to cover our bases?
thanks
Marika
-
Will using http ping, lastmod increase our indexation with Google?
No. You can submit a perfect sitemap and ping Google with changes every hour, but that will not increase the number of pages which are indexed.
A few good sources discussing sitemaps and indexing:
http://followmattcutts.com/2010/03/23/matt-cutts-on-sitemap-indexing/
http://faq.bloggertipsandtricks.com/2010/08/html-xml-sitemap-what-difference-matt.html
If you have a site with solid navigation, good architecture, and links, then there is no need to use a sitemap. Search engines will determine how often your site should be crawled based on your site's authority. They can also detect which pages have been modified by comparing each page's Last-Modified response header with the date stored in their database.
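The header comparison mentioned above is HTTP conditional fetching: a crawler remembers the Last-Modified value from its previous visit and sends it back as an If-Modified-Since request header, so an unchanged page can answer 304 Not Modified with no body. As a sketch (the timestamp is an arbitrary example value), the header is built like this:

```python
from email.utils import formatdate

# Unix timestamp recorded from the previous crawl (example value).
last_seen = 1288834974

# Crawlers send this back as the If-Modified-Since request header;
# the server replies 304 Not Modified if the page is unchanged.
if_modified_since = formatdate(last_seen, usegmt=True)
print(if_modified_since)  # Thu, 04 Nov 2010 01:42:54 GMT
```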
I still use a sitemap, but mostly because the process is fully automated. I know of other well-indexed sites that do not use sitemaps at all.
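For reference, a sitemap entry carrying a <lastmod> date looks like the sketch below (the URL and date are placeholder values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <!-- W3C Datetime format; a date alone is also valid -->
    <lastmod>2010-11-04</lastmod>
  </url>
</urlset>
```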
With the above understood, I'll try to offer a bit more information directly related to your questions. When you ask about pinging, I presume you mean mainly Google and Bing. In that case, the answer to all four of your questions is NO.
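For context, the ping itself is just an HTTP GET to the engine's ping endpoint with your sitemap URL passed, percent-encoded, as the sitemap query parameter. A minimal sketch of constructing those URLs (the sitemap location is a placeholder):

```python
from urllib.parse import urlencode

# Hypothetical sitemap location; substitute your own.
sitemap_url = "http://www.example.com/sitemap_index.xml"

# Google and Bing both accept the sitemap URL as a
# percent-encoded "sitemap" query parameter.
google_ping = "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})
bing_ping = "http://www.bing.com/ping?" + urlencode({"sitemap": sitemap_url})
print(google_ping)
```

Fetching either URL with any HTTP client performs the ping.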
Listing your sitemap location in robots.txt helps other search engines that you did not ping to locate your sitemap. This can include the SEOmoz crawler, for example.
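As a sketch, listing a sitemap in robots.txt is a single Sitemap directive, which can appear anywhere in the file and can be repeated for multiple index files (the URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap_index.xml
```

Note the URL must be absolute, since crawlers read the directive independently of which host served the robots.txt file.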