What is the largest page size a searchbot will crawl?
-
When setting up pagination, what should we limit the page size to? When will a searchbot stop crawling a particular page?
-
I think the ideal size is below 500 KB, though I suspect Google will crawl even larger pages if the content provides real value to users.
Back in 2005, I remember Google having much tighter limits on page size, but things are different today; they seem to allow larger files.
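If you want a quick sanity check against that rough 500 KB figure when building paginated pages, something like the sketch below works. The budget constant and function name are my own assumptions for illustration, not any official crawler limit:

```python
# Hypothetical helper: check whether a page's HTML stays within a
# size budget. 500 KB is the rough figure from the answer above,
# not a documented searchbot limit.
SIZE_BUDGET_BYTES = 500 * 1024

def within_crawl_budget(html: str, budget: int = SIZE_BUDGET_BYTES) -> bool:
    """Return True if the UTF-8 encoded HTML fits inside the budget."""
    return len(html.encode("utf-8")) <= budget

# Example: a small page easily fits the budget.
page = "<html><body>" + "item " * 200 + "</body></html>"
print(within_crawl_budget(page))
```

In practice you would run this against the fully rendered HTML of each paginated page, since templates and inline assets often add far more weight than the listing content itself.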