You can try to 301 redirect them elsewhere, but the filter will probably get passed along eventually as well. The surest bet is to 404 them but leave the content there (i.e. deliver a 404 status code but still allow people to get to that page's content). This way your users can still see the content, especially those coming from outside links that you can't control. However, this is not very elegant. I think Matt Cutts should chime in on this one, but I doubt he is a paying subscriber.
Posts made by HiveDigitalInc
-
RE: Handful of internal pages penguin penalized. 302 them or let them 404?
-
RE: Apache redirects
Perhaps you should simply say if it is NOT the new site, then redirect to the new site, rather than if it IS the old site, then redirect to the new site...
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.newsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.newsite.com/$1 [L,R=301]
This way you also handle the non-www to www redirect.
There is about an 80% chance something is wrong in that code, but let's pretend it is perfect. k thx.
-
RE: Avoiding the penguin slap in the face.
I agree that it certainly should be discussed in advance, but to me it seems petty. It is as if you are saying to the client that their money wasn't enough.
-
RE: Avoiding the penguin slap in the face.
Nofollowing them is fine, but honestly I think that links back are very tacky. You can always take credit via comments in the code, but an exposed link just doesn't make much sense. Advertising companies don't put their names on commercials, construction companies don't leave their names on your house, and web developers shouldn't leave their links on their clients' sites.
-
RE: Separate Site or should we incorporate it into our main site
The big opportunity would be to get all 3 sites to rank for primary keywords like personal development. Of course, this strategy will require 3 times the work.

My guess is that the trainers are not prepared to do the work to get each individual site to rank, so it probably makes sense to join them all together. If, in a few months, you find out that you are producing tons of content, you can always peel off the sites with 301 redirects.
-
RE: Multiple Listings in Results fading Local SEO
Actually, in many other spaces we are seeing extreme host crowding, where you might get 8 or more listings - here was one pointed out by Brett Tabke of Webmaster World
-
RE: Search Google Plus Profiles
site:plus.google.com "Lived in $city, $state" inurl:about
Scrape the results then extract the keywords.
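If you need the query for many locations, you can template it. A minimal sketch, assuming you have a list of (city, state) pairs to plug in (the city list here is hypothetical):

```python
# Build the Google+ profile search query for each (city, state) pair.
cities = [("Seattle", "WA"), ("Austin", "TX")]  # hypothetical input list

def build_query(city: str, state: str) -> str:
    """Fill the city and state into the site: search template."""
    return f'site:plus.google.com "Lived in {city}, {state}" inurl:about'

queries = [build_query(c, s) for c, s in cities]
```

Each resulting string can then be fed to whatever scraper you use to pull the results pages.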
-
RE: How do you get a Google+ pic in your SERP snippet
Many ways to do this right, but the most straightforward is this...
1. Create an about page on your site. From that page, link to your Google+ profile with a "rel=me" link.
2. On all your other pages, link to your about page with a "rel=author" link.
3. Add your site to the "contributor to" section on Google+.
4. Celebrate that you now are participating in the largest reciprocal link building scheme since the internet was started. Who knew Google was allowed to violate its own Webmaster Guidelines?
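Concretely, the markup from steps 1 and 2 might look like this (the profile ID and URLs are placeholders, not real accounts):

```html
<!-- On the about page: link out to your Google+ profile -->
<a rel="me" href="https://plus.google.com/123456789/">My Google+ profile</a>

<!-- On every other page: link back to the about page -->
<a rel="author" href="http://www.example.com/about/">About the author</a>
```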
-
RE: Check large number of websites for a specific keyword (or its synonym).
Are you checking to see if the word is just on the homepage, or on any page of the entire site? There isn't really a way to do this in SEOMoz, but depending on the number of pages there are a variety of solutions out there. If the list is large enough, you can use a scraper for hire like AuthorityLabs or 80Legs
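If it is just the homepages, a simple script can do it. A rough sketch using only the standard library (the function names are mine, and a real crawl should add rate limiting and politeness):

```python
# Check a list of homepages for a keyword in the raw HTML.
import urllib.request

def page_contains(html: str, keyword: str) -> bool:
    """Case-insensitive check for the keyword in the page source."""
    return keyword.lower() in html.lower()

def check_sites(urls, keyword):
    """Return the URLs whose homepage contains the keyword."""
    hits = []
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except OSError:
            continue  # skip sites that are down or unreachable
        if page_contains(html, keyword):
            hits.append(url)
    return hits
```

Checking every page of every site is a different beast, which is where the scraping services come in.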
-
RE: SEO basics for Q&A tool
Automation Automation Automation
1. Noindexing Thin Content: Perfectly fine strategy, but it should be easy to automate. Just compute a combined word count of the question and the answers provided. If that word count is greater than X, then remove the noindex tag.
2. Finding All Content: Good interlinking is all you need. First, you should have a page that lists all the latest threads that DO NOT have noindex tags on them. Second, you should interlink them well. If you have programmers, they might want to look into Yahoo Query Language (YQL) Term Extraction API. If you feed your QA content through this, it will find relevant words. With this, you can then create tags, and then you can show "related questions" based on those tags. This additional interlinking will help a lot.
3. Optimizing the Content: No. Users write questions the way they search for questions in Google. I would let the users control the content, and you just edit it for appropriateness.
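The word-count rule from step 1 can be sketched in a few lines. The threshold of 200 words is a placeholder I picked for illustration; tune X to your own content:

```python
# Decide whether a Q&A thread is still thin enough to keep noindexed.
MIN_WORDS = 200  # hypothetical threshold ("X" in the text above)

def needs_noindex(question: str, answers: list[str]) -> bool:
    """True while the combined question + answers text is below the threshold."""
    combined = " ".join([question, *answers])
    return len(combined.split()) < MIN_WORDS
```

Run this whenever an answer is posted, and drop the noindex tag as soon as it returns False.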
-
RE: What Is The Deal Between Indeed and Google?
Google's host-crowding features seem very out of whack, sometimes causing a single site to control 80% of the first page. I assume this will be fixed at some point by Google, but for the moment you just have to deal with it and try to beat them out.
-
RE: Need advice on having customer stores running on my subdomain
First off, the biggest concern will be spammers. Use a good CAPTCHA system like reCAPTCHA or OpenCaptcha to control who signs up, although charging money is the best way to keep them out.
Second, don't manipulate the anchor text at the bottom of the user generated stores, just use your brand name.
Third, I wouldn't be hugely concerned about it being a subdomain. Think about all the spam on blogspot and wordpress.com. You won't enjoy the authority status at the beginning that lends them some buffer against the spam, but if you control it at the beginning you should build up your site enough that you can weather some crap making its way through.
-
RE: Improving data tables for Usability & SEO
I actually prefer the list, but that might just be me. It makes the page feel authoritative because all that content is exposed. You may want to expose all the data by default, with a drop-down that hides everything but the specific content, so users can avoid scrolling if they choose.
-
RE: Google Reconsideration Request Without Warning Message First?
Rand recommends a pre-emptive reconsideration request if you really don't know the source of the bad tactics.
http://www.seomoz.org/blog/negative-seo-myths-realities-and-precautions-whiteboard-friday
-
RE: On blog structure and topic [Advanced]
This is a difficult question because the buy-in of the authors (faculty) matters. What you really want is that each blog created is regularly updated with good, fresh, vibrant content. If only 1 of the 20 faculty members is actually going to blog regularly, you will probably want to create a single blog and have them post to various categories. Otherwise, you will end up with one regularly updated blog and a ton of dead ones.
If you have good buy-in, I would create blogs on a category basis. It is highly unlikely that these blogs will be accessed by people looking for the faculty members specifically, so they don't each need their own blog.
-
RE: Did anyone Rankings drop massively last weekend ...Is this new google update ?
Did you lose your rankings on 1 particular day or did it happen over a few? Did you get an unnatural links notice in Google Webmaster Tools? Did you lose your rankings for all keywords or just a few you had been working on? Did you get knocked back to page 2 or just lose some rankings but remained on Page 1?
Sorry, but all these questions are a part of the diagnosis of what caused the rankings loss.
-
RE: Pagination Question: Google's 'rel=prev & rel=next' vs Javascript Re-fresh
It is unlikely that you will see a great improvement in traffic. The real reason you should probably move to this is usability. Individuals might want to bookmark a specific page of the results. Individuals might not have Javascript enabled, or might have browsers that are not compatible with the refresh method. My vote would be to move to rel=prev/rel=next.
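For reference, the head-section tags might look like this on page 2 of a paginated series (the URLs are placeholders):

```html
<!-- In the <head> of http://www.example.com/results?page=2 -->
<link rel="prev" href="http://www.example.com/results?page=1">
<link rel="next" href="http://www.example.com/results?page=3">
```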
-
RE: Low Domain Authority - Rank Well For Competitive Keywords
Why don't you run SEOMoz's keyword competitiveness tools to get an idea of what makes them better. Also, take a look at some competitor data sets like Majestic and Ahrefs - I always start with SEOMoz, but there is no reason to limit yourself to 1 data set. Chances are they have a bunch more spammy links that you aren't yet seeing, and those are giving them the lift.