Duplicate Content & Tags
If the only option is to disallow via robots.txt, then I would agree with your setup: disallow the URL slugs specific to the tags you don't want indexed. I've heard Shopify can be rough to work with because of its platform limitations, so whatever you can do is better than nothing.

Remember that robots.txt exclusion is treated as a suggestion, not a command, so if it's possible to assign a noindex meta tag to those URL types, that would be the best case. It looks like you're on the right track with the post below: {% if handle contains 'tagged' %} {% endif %}

The one suggestion I would make is to use noindex,follow so the content will still be crawled, but the duplicate tag pages won't get indexed. That keeps multiple paths to the content on your site without creating an index-bloat issue from the tag pages.
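Putting the pieces together, a minimal sketch of what this could look like in the theme's layout file (e.g. theme.liquid) is below. It assumes tag-filtered URLs contain "tagged" in the page handle, as in the condition above; adjust the check to match your store's actual URL structure.

```liquid
{%- comment -%}
  Hypothetical snippet for the <head> of theme.liquid:
  on tag-filtered pages, emit a robots meta tag so crawlers
  still follow links but do not index the duplicate page.
{%- endcomment -%}
{% if handle contains 'tagged' %}
  <meta name="robots" content="noindex,follow">
{% endif %}
```

Because the meta tag only appears when the condition matches, your main collection and blog pages remain indexable as normal.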
Content & Blogging | Eric_Rohrback0