Are there detrimental effects of having multiple robot tags
-
Hi All,
I came across some pages on our site that have multiple robot tags, but they have the same directives. Two are identical while one is for Google only. I know there aren't any real benefits from having it set up this way, but are there any detrimental effects such as slowing down the bots crawling these pages?
<meta name="googlebot" content="index, follow, noodp"/>
Thanks!
-
Hi there.
There shouldn't be any difference beyond a few extra bytes to load. The first rule is what tells the bots what to do; the duplicates are simply redundant.
Hope this helps.
-
As Dmitri mentioned, it shouldn't really matter; it's just redundancy. That said, engine-specific tags can sometimes be useful on bigger sites, where you may want to give certain engines their own directives and control their behaviour individually.
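For example, a minimal sketch of that kind of setup (the directives here are illustrative): a general robots tag for all crawlers, plus a Googlebot-specific tag with an extra directive. Engines other than Google simply ignore the googlebot tag.

```html
<head>
  <!-- Read by all crawlers that honour the robots meta tag -->
  <meta name="robots" content="index, follow"/>
  <!-- Read only by Googlebot; adds noodp on top of the general rules -->
  <meta name="googlebot" content="index, follow, noodp"/>
</head>
```

This avoids the pure duplication in the original question: each tag exists because it says something different to a different audience.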