Questions
-
URL Best Practices (for site with millions of records)
Isn't New-York just a waste of URL length? Shouldn't it be New-York-City,-NY? And you're exceeding the recommended URL length. Adrian
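For anyone weighing the two slug formats mentioned above, here is a minimal sketch of what each location name comes out as after a typical slugify step. The helper below is illustrative only, not from the original thread, and note that a conventional slugifier drops the comma the commenter kept:

```python
# Illustrative slug builder for location pages (not from the original post).
import re

def slugify(text: str) -> str:
    """Lowercase, strip punctuation, and hyphenate a location name."""
    text = re.sub(r"[^\w\s-]", "", text.lower())  # drop commas and other punctuation
    return re.sub(r"[\s_]+", "-", text).strip("-")  # collapse whitespace to hyphens

print(slugify("New York"))           # new-york
print(slugify("New York City, NY"))  # new-york-city-ny
```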
Local Listings | abtechgroup1 -
Can you confirm legitimate Googlebot traffic?
Thank you very much for all of your feedback, folks. I really appreciate it. Cloudflare has acknowledged that they are aware of the issue and are working to correct it; it also affected Bingbot. I ended up whitelisting the IP ranges, and our Google crawl rate has increased as a result.
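For anyone who needs to confirm which hits are genuine before whitelisting anything, here is a minimal sketch of Google's documented reverse-then-forward DNS check: the PTR hostname must end in googlebot.com or google.com, and that hostname must resolve back to the same IP. The sample address is just a known Googlebot range used for illustration:

```python
# Sketch: verify that an IP claiming to be Googlebot really is, using the
# reverse-then-forward DNS check Google documents for this purpose.
import socket

def is_legit_googlebot(ip: str) -> bool:
    """Return True if `ip` reverse-resolves to a Google crawler hostname
    and that hostname forward-resolves back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS (PTR) lookup
    except socket.herror:
        return False  # no PTR record: not Googlebot
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # PTR points somewhere other than Google
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS lookup
    except socket.gaierror:
        return False
    return ip in forward_ips  # both directions must agree

if __name__ == "__main__":
    print(is_legit_googlebot("66.249.66.1"))  # sample IP from a Googlebot range
```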
Technical SEO Issues | akin671 -
Help recover lost traffic (70%) from robots.txt error.
Firstly, I would definitely take the opportunity to switch to SSL. A migration to SSL shouldn't be something to worry about if you set up your redirects properly, and given that most of your pages aren't indexed at all, it is even less risky. You will eventually get the traffic back; as for how long that takes, it's very difficult to say.

I would concentrate on crawlability: make sure your structure makes sense and that you aren't linking to any 404s or worse. Given the size of your site, that wouldn't be a bad thing anyway.

From your description of your pages, I'm not sure there is any "importance hierarchy", so my suggestion may not help, but you could make use of Google's Indexing API to submit pages for crawling (a sketch follows below). Unfortunately, you can only submit in batches of 100 and you are limited to 200 requests a day. You could, of course, prioritise or cherry-pick some important pages and "hub" pages, if such things exist within your site, and then start working through those.

Following the recent Google blunder where huge swathes of the web were deindexed and, in the short term, the only way to get pages back into the index was to resubmit them, someone has provided a tool to interact with the API, which you can find here: https://github.com/steve-journey-further/google-indexing-api-bulk
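For reference, here is a minimal sketch of what submitting URLs to the Indexing API looks like, in case you want to script it yourself rather than use the linked tool. It assumes a Google Cloud service-account key with the indexing scope; the key filename and URL list are placeholders, and for simplicity it sends individual publish requests rather than using the 100-item batch endpoint:

```python
# Sketch: submit URLs to the Google Indexing API one at a time.
# Requires the google-auth package and a service-account JSON key
# (filename below is a placeholder) authorised for the indexing scope.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder key file
session = AuthorizedSession(credentials)

# Placeholder URL list; in practice, read your priority/hub pages from a file.
urls = ["https://www.example.com/page-1", "https://www.example.com/page-2"]

for url in urls[:200]:  # stay under the default 200-requests/day quota
    response = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
    print(url, response.status_code)
```

(Officially the Indexing API is documented for job-posting and livestream pages, so treat submissions of ordinary pages as best-effort.)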
On-Page / Site Optimization | Xiano1