Is using JavaScript-injected text in line with best practice on making blocks of text non-crawlable?
-
I have an ecommerce website that has common text on all the product pages, e.g. delivery and returns information.
Is it OK to use non-crawlable, JavaScript-injected text as a way to make this content invisible to search engines? Or is this method frowned upon by Google?
By way of background: I'm concerned about duplicate/thin content, so I want to tackle this by reducing this common text as well as boosting the unique content on these pages.
Any advice would be much appreciated.
-
I haven't found standard shipping info text to be much of a problem in terms of duplicate content on product pages, but if you're concerned about it I would consider an iframe.
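To make the iframe suggestion concrete, here is a minimal sketch (the URLs and file names are illustrative, not from the original): serve the shared delivery/returns text from its own URL, embed that URL on each product page, and disallow it in robots.txt so the framed text isn't crawled as part of the product page.

```html
<!-- On each product page: pull the shared copy in via an iframe.
     /snippets/delivery-info.html is a hypothetical path for this sketch. -->
<iframe src="/snippets/delivery-info.html"
        title="Delivery and returns information"></iframe>
```

With a matching robots.txt rule so the framed page itself isn't indexed:

```
User-agent: *
Disallow: /snippets/
```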
While hiding the content in a non-crawlable .js file might work, it isn't advisable. First, Google's Preview bot can parse JavaScript on the page and see the content in many cases, as is necessary to generate the site preview and visual cache of a page. Second, anything that keeps Google from properly rendering the page may be considered suspect.
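For reference, the pattern the question describes usually looks something like the following sketch (file names and copy are hypothetical); note the caveat above that Google may still execute the script and see the injected text, so this is shown for clarity, not as a recommendation.

```html
<!-- Product page: empty placeholder filled in by a script
     that is disallowed in robots.txt -->
<div id="delivery-info"></div>
<script src="/js/delivery.js"></script>
```

Where the blocked delivery.js might contain:

```js
// delivery.js — injects the common delivery/returns copy client-side
document.getElementById('delivery-info').textContent =
  'Standard delivery in 3-5 working days. Returns accepted within 30 days.';
```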
Please view this video by Matt Cutts: http://www.youtube.com/watch?v=B9BWbruCiDc
Essentially, he says: "Don't block Googlebot from CSS or JavaScript".