Potential issue: Page design might look like keyword stuffing to a web crawler
-
We have an interesting design element we might try on our home page. Here's a mockup: https://codepen.io/dsbudiac/pen/Bwrgjd
I'm worried web crawlers will interpret this as keyword stuffing, which could hurt our rankings. It features:
- Mostly transparent/hidden text
- Repeating keyword list
I could try a couple of methods to work around crawling concerns:
- Load keywords through an iframe
- Make the keywords an image (would significantly increase page load)
- Inject the keywords into a container with JavaScript after page load (probably not effective, as crawlers are only getting better at indexing JavaScript)
- Load the keywords into an svg element
- Load the keywords into a canvas element via javascript
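For what it's worth, the SVG option might look roughly like this (keyword text and styling are placeholders, and note that Google can read inline SVG text, so an external .svg file would likely be the safer variant):

```html
<!-- Rough sketch: keyword texture rendered as SVG text.
     Marked decorative so assistive tech ignores it. -->
<svg viewBox="0 0 800 600" role="presentation" aria-hidden="true">
  <text x="0" y="20" fill="rgba(0,0,0,0.06)">keyword one keyword two keyword three</text>
  <text x="0" y="40" fill="rgba(0,0,0,0.06)">keyword one keyword two keyword three</text>
</svg>
```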
I have a few questions:
- Should I be concerned about any potential keyword stuffing / SEO issues with this design?
- Can you comment on the effectiveness (with proof) of the above strategies?
- Am I better off just abandoning this type of design?
-
I thought about using googleon/googleoff tags, but apparently that's only for Google Search Appliance, and not traditional google search/index: https://webmasters.stackexchange.com/questions/54735/can-you-use-googleon-and-googleoff-comments-to-prevent-googlebot-from-indexing-p
-
As long as you don't use that text inside a header, link, or other relevant piece of content, you shouldn't have to worry about it. As I understand it, the h1 is the main factor Google uses to determine the main keyword of a specific page.
-
Never said the image option was hard. It's just not ideal as it increases page load and is less flexible. A noindex'd iframe seems to be the best option. We already have a working proof of concept, thanks.
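For anyone curious, the proof of concept is roughly this shape (filenames are placeholders):

```html
<!-- Main page: keyword texture loaded in a decorative iframe.
     "/keywords.html" is a placeholder path. -->
<iframe src="/keywords.html" tabindex="-1" aria-hidden="true"></iframe>

<!-- /keywords.html itself opts out of indexing via a robots
     meta tag (an X-Robots-Tag: noindex response header on the
     iframe document would work as well). -->
<!DOCTYPE html>
<html>
<head><meta name="robots" content="noindex"></head>
<body>keyword one keyword two keyword three</body>
</html>
```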
-
Ah, a very interesting question!
I'd not be too concerned; you're loading the content in through a data attribute rather than directly as text. However, there are definitely a few options you could consider:
- Render via SVG feels like the safest bet, though that's going to be a pretty large, complex set of vectors.
- Save + serve as an image (and overcome the file size concerns by using WebP, HTTP/2, a CDN like Cloudflare, etc)
- Serve the content via a dedicated JavaScript file, which you could block access to via robots.txt (a bit fudgey!)
I'd be keen to explore #2; feels like you should be able to achieve the effect you're after with an image that isn't ridiculously huge.
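To illustrate the third option, the robots.txt block would be just a couple of lines (the script path is a placeholder, and the caveat stands that Google discourages blocking JavaScript it needs for rendering):

```
# robots.txt: stop compliant crawlers from fetching the keyword script.
User-agent: *
Disallow: /js/keywords.js
```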