Adding directories to robots.txt disallow causes pages to have Blocked Resources
-
In order to eliminate duplicate/missing title tag errors for a directory (and its sub-directories) under www that contains our third-party chat scripts, I added the parent directory to the robots.txt disallow list.
We are now receiving a Blocked Resources error (in Webmaster Tools) on every page that loads a JavaScript file (for live chat) from that parent directory.
My host is suggesting that the warning is only a notice and we can leave things as is without worrying about the page being de-ranked/penalized.
I am wondering if this is true, or if we should remove the directory that contains the JS file from robots.txt and find another way to resolve the duplicate title tags.
-
Hi there
Don't block pages or directories because they have duplicate/missing titles - instead, fix them - those pages may provide value and you could be missing some big opportunities.
"...we can leave things as is without worrying about the page being de-ranked/penalized." - you're blocking the page in your robots.txt, so you're already not being indexed. If you are looking to index these pages, you need to remove that robots.txt.
Go through, assess your titles and page content, and change your titles to be uniquely relevant to the content on that page. If you have the opportunity as well, do a content audit to see if pages can be condensed or expanded.
Hope this all helps! Good luck!