Questions
Philosophy & Deep Thoughts On Tag/Category URLs
Hey Mike,

Great question(s)!

**1. Are indexed tag/category pages really a duplicate content problem, and if so, why the heck?**

Since we are getting philosophical, let's define "duplicate content" in the first place. There are really two different types:

- **Technical duplicate content** - this is the kind we're referring to here. It's not real duplicate content (you're not copying the same article over and over, and it's not even cross-domain). Technical duplicate content exists as a by-product of the CMS or web development: tracking parameters, non-canonical homepages (www, non-www, /index.html all loading, etc.), sorting functions on ecommerce sites.
- **Actual duplicate content** - this is more like when someone has scraped an article from one domain to another, or copied an article on purpose, to actually try to pass it off as "unique" when it's totally copied.

Tags and categories sort of cause "technical duplicate content," but not always. It depends how you have WordPress set up. Most commonly, I see them create duplicate content in the sense that a tag archive might look almost exactly the same as the article page itself, or very similar.

OR, what a lot of people are referring to without even realizing it (which is a bit of a pet peeve) is the subpages off of tags and categories. When tag and/or category pages paginate (again, depending on how it's set up), the title tags will look like duplicates. For example:

- /tag/exercise-and-nutrition/ has the title tag: _Exercise and Nutrition - Healthblog.com_
- /tag/exercise-and-nutrition/page/2/ etc. _still_ has the title tag: _Exercise and Nutrition - Healthblog.com_

So the question really is: if tags/categories are "technical duplicate content," is THAT type of "duplicate content" an issue? I've heard Google say: NO. John Mueller from Google has said multiple times in Webmaster Central Hangout Help videos that Google can distinguish this sort of accidental duplication from real duplicate content.
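To make the pagination issue above concrete, here's a minimal sketch of how you might spot duplicate title tags across archive URLs from crawl data. The URLs and titles are hypothetical examples modeled on the ones above, not real crawl output:

```python
# Sketch: group crawled URLs by title tag and surface titles shared by
# more than one URL (the "duplicate title from pagination" symptom).
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages is a list of (url, title) pairs; returns titles used by 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl of a tag archive and its paginated subpages.
crawl = [
    ("/tag/exercise-and-nutrition/", "Exercise and Nutrition - Healthblog.com"),
    ("/tag/exercise-and-nutrition/page/2/", "Exercise and Nutrition - Healthblog.com"),
    ("/tag/recipes/", "Recipes - Healthblog.com"),
]
dupes = find_duplicate_titles(crawl)
```

The paginated subpage shows up as sharing its parent archive's title, which is exactly the pattern people mistake for real duplicate content.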
BUT, not so fast: tags and categories can still be an issue, just NOT because of "duplicate content." It really all depends how you have them set up.

1. I first recommend understanding the distinctions between tags and categories (image from my WordPress article).
2. I do recommend indexing categories by default in most cases (I'm not sure where you've heard to noindex categories). That's IF they are used correctly per #1 above. If you use 5-8 well-constructed and well-chosen categories, there should not be a problem with indexing them.
3. Noindex subpages of archives. This kills 95% of what some folks mistakenly call "duplicate content" but is really just duplicate title tags from the pagination of subpages.
4. I highly advocate leaving some tags indexed (using the Yoast SEO plugin) if they are bringing traffic; here's how I do that analysis when de-indexing tags.

Here are the REAL issues that tags and subpages CAN create:

- **Index bloat** - lots of pages getting indexed that fill up the index and distract from what you might prefer to rank for instead.
- **Poor user metrics from Google results** - users tend to bounce off of tag archives, creating lower user metrics, which can feed back into rankings.
- **Dilution of content** - while this isn't "duplicate content," it is content dilution: multiple pages that all sort of overlap in topics.

**2. Is there a strategy for ranking tag/category pages for news publishing sites ahead of article pages?**

Totally! Check out Kane's comment on my WordPress post. Essentially he is saying to customize your category archives with some unique content on them, so as to distinguish them from posts. Also, only display excerpts of your posts on archive pages. We always cite SugarRae's blog as a great example; check out her category page. It has totally unique content at the top, and the posts below.
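The "leave traffic-earning tags indexed" analysis in point #4 can be sketched as a simple split over per-tag session counts. The tag names, traffic numbers, and threshold below are illustrative assumptions, not from the original analysis:

```python
# Sketch: decide which tag archives to keep indexed based on organic
# sessions. The 10-session threshold is an arbitrary example cutoff.
def split_tags(tag_sessions, keep_threshold=10):
    """Return (keep_indexed, set_noindex) lists of tag slugs."""
    keep, noindex = [], []
    for tag, sessions in tag_sessions.items():
        (keep if sessions >= keep_threshold else noindex).append(tag)
    return sorted(keep), sorted(noindex)

# Hypothetical per-tag organic traffic pulled from analytics.
traffic = {"exercise-and-nutrition": 120, "recipes": 45, "misc": 0, "random-tag": 2}
keep, noindex = split_tags(traffic)
```

The real analysis would of course also weigh rankings and content overlap, but the shape of the decision is the same: earners stay indexed, dead weight gets noindexed.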
To conclude, and to keep it philosophical: I think what you're also getting at here is an important part of SEO (or anything) that people don't talk about as much, and that's the idea of keeping an open mind, analyzing your specific situation, testing the limits of the "rules," and really applying your own brain. Validate things for yourself.

One of the biggest issues is that most people do not use tags in a deliberate way or really understand how they function. They just slap 20 tags on every post (which they think is a magic SEO trick) and end up with thousands of tag pages (I've seen sites with 7,000+ tag archives!). At the beginning this might not be an issue, but over time, done recklessly like that, it can cause some of the problems noted above.

Great question!

-Dan
Intermediate & Advanced SEO | evolvingSEO
"Starting Over" With A New Domain & 301 Redirect
RCN,

There is no "penalty transfer" per se. It is just that those links will have no value, but you can still get people there. You could use 302s if there is some concern, but you don't really need to. Remember, you are redirecting a page, and that is purposeful. You could also disallow the links to the 301'd pages if you want to be extra careful. Again, though, I would do the 301s and move on.

Best,
Robert
Intermediate & Advanced SEO | RobertFisher
Filling Up Content For A New News Publishing Site
"The articles are not evergreen" - I have a "news" area on one of my sites. Instead of making the huge investment in authorship, I simply link, with a comment, to articles on other sites. This allows me to provide four, six, eight, ten items per day in just an hour; writing (at least for me) would take a very long time. If there is enough news in your niche and you can select the valuable stories, your site could become the "go-to" location for people to keep up with the news. In my opinion, the human aggregator with some commentary is better than the robots. If you do a good job and allow people to subscribe by RSS and email, you might build up an extensive list of industry subscribers. Once you do that, the stories will start coming to you instead of you hunting for them.
Content & Blogging | EGOL
Does Google News Inclusion Affect Organic Rankings?
I can't see why Google would penalise a news site for being indexed in Google News.
Technical SEO Issues | Rippleffect
Is Noindex Enough To Solve My Duplicate Content Issue?
Definitely deal with the security issues! Good find there... Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine. Or, alternatively, place a canonical tag on the duplicate sites to let Google know where the true source lies.
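The two options above (noindex the copies, or canonical back to the source) boil down to one extra tag in each duplicate copy's `<head>`. A minimal sketch, with a made-up domain standing in for the true source:

```python
# Sketch: the head tag each duplicate copy of the article would carry,
# depending on which option you choose.
def noindex_tag():
    # Option A: keep the duplicate out of the index entirely.
    return '<meta name="robots" content="noindex, follow">'

def canonical_tag(original_url):
    # Option B: leave it indexable but point Google at the true source.
    return f'<link rel="canonical" href="{original_url}">'

head_a = noindex_tag()
head_b = canonical_tag("https://original-site.example/article/")
```

Either way, only the original site's copy carries neither tag, so it remains the version Google consolidates signals to.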
Technical SEO Issues | MarieHaynes
Exponentially Increasing Duplicate Content On Blogs
With WordPress you can noindex category pages, tags, author pages, archives, etc. Most of the SEO plugins let you do this; I know Yoast's plugin does for sure. Also make sure your site is using canonical tags, and you should be good.
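The per-archive-type noindexing described above can be sketched as a lookup from page type to robots directive, similar in spirit to the plugin settings (the page-type names and defaults here are illustrative assumptions, not the plugin's actual option names):

```python
# Sketch: emit a robots meta value per archive type, mirroring the
# "noindex tags/authors/date archives, index categories" setup.
NOINDEX_TYPES = {"tag", "author", "date_archive"}

def robots_meta(page_type):
    """Return the robots directive a page of this type would carry."""
    if page_type in NOINDEX_TYPES:
        return "noindex, follow"
    return "index, follow"

# e.g. category archives stay indexed, tag archives do not.
category_robots = robots_meta("category")
tag_robots = robots_meta("tag")
```

The `follow` part matters: even on noindexed archives you generally still want link equity to flow through to the posts they list.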
On-Page / Site Optimization | David_ODonnell
Can Linking Between Your Own Sites Excessively Be a Penguin No-No?
Not really. That will simply keep PageRank from flowing, but it doesn't fix the root of the problem. It's not necessary to re-write everything; you just need to clean it up. For instance, instead of using:

- Luxury Resorts
- Deluxe Resorts
- Standard Resorts
- Budget Resorts

you simply have a drop-down box that says "Resort Type," and the dropdown says:

- Luxury
- Deluxe
- Standard
- Budget

That one change alleviates 4 "resort" references. Plus it's more user friendly and easier to read. There are many consultants (like us) who can also repair your content if you don't want to do it yourself. It just needs to be natural, and the way the verbiage is currently, it's viewed as keyword stuffing. Structure the site for the user and not for Google. Your ultimate goal is a sale, and your current sites are kind of tough to read because you've exerted too much energy into pushing keywords into the content. It may take a couple of days per site to scrub out, but it will be worth it in the end.
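The arithmetic behind "that one change alleviated 4 resort references" can be sketched directly, using the example labels from the answer above:

```python
# Sketch: count how many times the stuffed keyword appears in the nav
# labels before and after switching to the "Resort Type" dropdown.
before = ["Luxury Resorts", "Deluxe Resorts", "Standard Resorts", "Budget Resorts"]
after_label = "Resort Type"
after_options = ["Luxury", "Deluxe", "Standard", "Budget"]

def count_word(word, texts):
    """Count exact whole-word occurrences of `word` across the labels."""
    return sum(t.split().count(word) for t in texts)

reps_before = count_word("Resorts", before)
reps_after = count_word("Resorts", after_options)
```

Same navigation, same four options, four fewer repetitions of the keyword on the page.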
Intermediate & Advanced SEO | GeorgiaSEOServices