Questions
Very well established blog, new posts now being indexed very late
A robots.txt disallow is designed to block content from being crawled at all. Normally, if your robots.txt file were the factor, your content would not appear in SERPs at all. Content can still appear in SERPs while blocked by robots.txt if it is linked from other sources, but since this is new content, that is unlikely unless you are sharing links immediately and Google is seeing those links within the time frame you described.

The first place I would look is your sitemap, or whatever tool informs Google that you have new content. When you publish a new blog article, your software should ping Google to announce it. That is where any investigation should begin. The next step is to check your server logs to see how long Google takes to respond to that alert. If it takes them 12 hours, there is nothing further you can do about it.

I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm it? As a side note, your robots.txt file is bloated and doesn't adhere to any standard I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
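If you want to rule out the ping step itself, you can fire one manually. A minimal sketch using Python's standard library and Google's sitemap ping endpoint (since deprecated by Google, but it was the standard mechanism when this question was asked); the sitemap URL is a placeholder:

```python
from urllib.parse import quote
from urllib.request import urlopen

# Placeholder sitemap URL -- substitute your own.
SITEMAP = "https://example.com/sitemap.xml"

# Google's sitemap ping endpoint (deprecated in 2023; shown here
# because it was the standard notification mechanism at the time).
resp = urlopen("https://www.google.com/ping?sitemap=" + quote(SITEMAP, safe=""))
print(resp.status)  # HTTP 200 means the ping was received
```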
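To measure the publish-to-crawl lag from your server logs, something like the following rough sketch works. It assumes an Apache/Nginx combined-format access log; the log path, post path, and publish time are all hypothetical and need to be replaced with your own values:

```python
import re
from datetime import datetime

# Hypothetical values -- adjust to your server and the post in question.
LOG_PATH = "/var/log/apache2/access.log"
POST_PATH = "/new-blog-post/"
PUBLISHED = datetime.strptime("21/Mar/2013:09:00:00 +0000",
                              "%d/%b/%Y:%H:%M:%S %z")

# Combined log format: the timestamp sits inside [...] and the request
# path is the second token of the quoted request line.
line_re = re.compile(r'\[(?P<ts>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+)')

with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:  # crude UA filter; verify by IP if needed
            continue
        m = line_re.search(line)
        if not m or m.group("path") != POST_PATH:
            continue
        hit = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
        lag_hours = (hit - PUBLISHED).total_seconds() / 3600
        print(f"First Googlebot fetch: {hit} ({lag_hours:.1f} h after publish)")
        break
```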
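On the side note about the bloated robots.txt: a standards-compliant file for a typical blog rarely needs more than a handful of directives. A minimal sketch (the disallowed path and sitemap URL are placeholders; only block what you genuinely never want crawled):

```
# Minimal, standards-compliant robots.txt (paths are placeholders)
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```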
Technical SEO Issues | RyanKent
Please help me stop Google indexing HTTPS pages on my WordPress site
Hi Rookie123. Unique IPs may be of use from an SEO perspective if you have multiple sites that you are interlinking in the hope of boosting SEO, though there is no guarantee that Google will not work out that the sites are related anyway. If this is a single stand-alone site, I would not worry about having a unique IP, as I am not aware of any tangible benefit; I would focus instead on resolving the duplicate content issue caused by the HTTPS pages. Hope this helps.
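On the duplicate content point: the usual fix is a sitewide 301 so only one protocol version is crawlable. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled; the direction shown matches the question's goal of keeping the HTTP version, though today the standard advice is the reverse (force HTTPS):

```
RewriteEngine On
# 301-redirect every HTTPS request to its HTTP equivalent so Google
# consolidates indexing signals on a single version of each page.
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```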
Content & Blogging | CPU