I am looking for the best way to block a domain from getting indexed.
-
We have a website, http://www.example.co.uk/, which leads to another domain (https://online.example.co.uk/) when a user clicks a link; in this case, let's say it is the Apply Now button on a page of my website. We are getting metadata issues in the crawl errors for the https://online.example.co.uk/ domain, as we are not targeting any meta content on that particular domain. So we are looking to block this domain from getting indexed in order to clear these errors. Does it affect the SERPs of this domain (https://online.example.co.uk/) if we use a noindex tag on it?
-
If you use the meta noindex tag on a page, it will be excluded from the index. So, yes, it will affect the SERPs of that domain by removing its results from them.
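For reference, the noindex directive mentioned above is a standard robots meta tag placed in the head of each page; the site URL in the comment is the one from this thread:

```html
<!-- Place inside the <head> of every page on https://online.example.co.uk
     that should be removed from search results -->
<meta name="robots" content="noindex">
```

Crawlers must be able to fetch the page to see this tag, so the page should not also be blocked from crawling if you rely on it.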
-
Hi, thanks for letting me know. It would be great if you have a wildcard solution for this.
-
I'd recommend putting a robots.txt file on the https://online.example.co.uk site you don't want indexed.
Just save the following as robots.txt:
User-agent: *
Disallow: /
Then add it to the root folder of the site, so it is served at https://online.example.co.uk/robots.txt.
This tells all compliant search spiders not to crawl any pages on the entire site. Note that robots.txt controls crawling rather than indexing, so pages that are already indexed may take some time to drop out of the results.
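As a quick sanity check, you can verify with Python's standard-library robots.txt parser that the two lines above block every path on the site for any user agent (the URLs are the ones from this thread):

```python
from urllib.robotparser import RobotFileParser

# The exact robots.txt content suggested above.
robots_txt = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /", no compliant crawler may fetch any path on the site.
print(parser.can_fetch("Googlebot", "https://online.example.co.uk/"))   # False
print(parser.can_fetch("*", "https://online.example.co.uk/apply-now"))  # False
```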
-
Hi, the real challenge here is that we are not using any Google tools like Webmaster Tools (Search Console) on https://online.example.co.uk, so to my knowledge robots.txt won't work. I will be waiting to hear from you if we have any other options here.
Thanks
P