If you've never done it before, it's probably a good idea to get the word out. Although if you're going to be doing a lot of this in the future, you might want to start building up your own social media accounts.
Actually on second thoughts - YES. Yes it probably is the reason your terms are dropping.
Could be.
That's a directive that tells search engines not to include that page in their indexes.
If you're paginating correctly there shouldn't be a problem, but you probably want to add some indication of the page number into the title for clarity for users. There's an old post by Rand about it here: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
Although, as you're loading in the new results with AJAX I'm not sure if this complicates matters or simplifies them. I wouldn't foresee there being any problems but haven't experienced it first hand.
If you look underneath the crawl diagnostics chart there's a text box with the date of the last crawl and the scheduled date of the next crawl.
It's about every 7-10 days, I think.
So I'm going to guess launching a new domain isn't an option?
You can use Excel to segment the linking root domains (LRDs) by those that are probably directories - those that contain terms like "directory", "URL" and "submit" in the domain. There are a couple of ways to do it, but just as an example, if you enter the following into Excel it will tell you whether the URL in A2 contains the term "directory".
=ISNUMBER(SEARCH("directory", A2))
It pays to scan over these links and see if there are any decent sites that you've bundled in with the bad but mostly I'd imagine they will just be directories.
You can do the same thing with adult terminology to find those types of sites. The method is by no means perfect but it is quick and easy.
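If you'd rather do this outside Excel, here's a rough Python sketch of the same idea. The term lists and example domains are just illustrative - adjust them to match whatever patterns show up in your own export:

```python
# Flag linking root domains that are probably directories (or adult sites)
# by checking for tell-tale terms in the domain name. This mirrors the
# Excel ISNUMBER(SEARCH(...)) approach; the term lists are illustrative.

DIRECTORY_TERMS = ("directory", "url", "submit")
ADULT_TERMS = ("xxx", "adult")  # extend as needed

def flag_domain(domain):
    """Return a rough label for a linking root domain."""
    d = domain.lower()
    if any(term in d for term in DIRECTORY_TERMS):
        return "probable directory"
    if any(term in d for term in ADULT_TERMS):
        return "probable adult site"
    return "review manually"

domains = ["www.best-url-directory.com", "example.com", "submitlinksfree.net"]
for dom in domains:
    print(dom, "->", flag_domain(dom))
```

As with the Excel version, this is quick and dirty rather than perfect - it just gives you a first-pass segmentation to eyeball.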
Possibly the easiest way to spot low-quality links is looking at the C-block. If you've got 50 links from the same C-class IP then it's either a bad link network or a blog service. You'll spot the difference pretty quickly, although if you have several clients wanting this service then it might be worth excluding the IP ranges of the major blog services in your spreadsheet. I don't think you can currently get IP data from OSE but I'm sure you'll figure out a way.
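Once you do have IP addresses for your linking domains (e.g. by resolving them yourself), grouping on the first three octets is enough to surface shared C-blocks. A rough sketch, assuming you've already built a domain-to-IP list:

```python
from collections import defaultdict

def c_block(ip):
    """Return the C-class prefix of an IPv4 address (first three octets)."""
    return ".".join(ip.split(".")[:3])

def group_by_c_block(links):
    """links: list of (domain, ip) pairs. Returns {c_block: [domains]}."""
    groups = defaultdict(list)
    for domain, ip in links:
        groups[c_block(ip)].append(domain)
    return groups

# Illustrative data - dozens of domains sharing one C-block would
# suggest a link network or a shared blog host.
links = [("blogA.example", "192.0.2.10"),
         ("blogB.example", "192.0.2.99"),
         ("news.example", "203.0.113.5")]
for block, domains in group_by_c_block(links).items():
    print(block, len(domains), domains)
```

Sorting the groups by size then puts the suspicious clusters at the top of your list.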
That's all that I can think of off the top of my head but I'll add more if I think of them.
I'd approach him from the angle that it's possible to be both professional and personable. If you pull out editorial from a high-quality newspaper, or even cite some influential blogs in a space he follows, you should be able to put that point to him pretty convincingly.
Alternatively, I think Will Critchlow recommends hiring people on Amazon Mechanical Turk to answer a Panda-style questionnaire. You could compare the current text with a version written in the first person and see which one real people prefer. Or pitch the two versions of the text head to head in an A/B test and see which has higher user engagement statistics.
If you're only doing the blog to generate links/shares etc. and he insists on using his own writing style then it sounds like it would just be a waste of his time.
If I had the choice I'd go for Facebook Comments, purely for the social sharing.
Hi Sal,
If you look for the referrer column in the report you can see which pages are linking to the broken URLs.
Fix these broken links and you won't be generating so many 4xx pages.
That's the theory anyway. It can be a pretty arduous task but if you stick to it you should be able to get that number down.
Hi Jay,
Sorry to hear it's hurting your business so much.
Have you double checked the dates of your decrease in traffic against the Penguin update? There were a lot of big changes going on around that time so it's worth being sure it was Penguin.
In answer to question 3 - If they're external sites then I don't think those 1700 404s are having a negative effect on your SEO. If those directories are hurting you at all through the Penguin update then it would be through over-optimised anchor text (although I haven't seen any definitive data on this).
In answer to question 2 - Would I be right in thinking that you're using a 301 or a 302 to send users to a generic error page? However you're generating the soft 404s, the best fix is to make them real 404 errors so the server returns a 404 status code. The details of setting up a custom 404 page are pretty well documented around the web, so you shouldn't have much trouble with it.
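The key point is that the server answers with a 404 status code itself rather than a 301/302 to an error page. Stripped to its essentials (the path list here is purely hypothetical), the routing decision looks like:

```python
# A soft 404 happens when a missing page answers 200 (or redirects to a
# page that does). The fix: return the 404 status code directly, with
# your custom error page as the response body. KNOWN_PATHS is illustrative.
KNOWN_PATHS = {"/", "/products", "/contact"}

def status_for(path):
    """Return the HTTP status the server should send for this path."""
    return 200 if path in KNOWN_PATHS else 404

print(status_for("/products"))   # existing page -> 200
print(status_for("/old-page"))   # missing page -> hard 404, no redirect
```

Your custom 404 template can still be as friendly as you like - what matters to Google is the status code in the response header.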
In answer to question 1 - Have you tried checking to see if Google has re-cached your pages since the change? It's also probably worth looking at the rel=prev rel=next markup as well. Maile Ohye from Google has released a pretty comprehensive video on the topic of pagination and SEO so I'd recommend checking that out.
Question 2 first - I don't think there's any need for the nofollow at all. In fact I'm pretty certain Google have gone on record saying that you'll never need to nofollow internal links.
Question 1 - If you've got some unique content in the main body of the page I wouldn't foresee it being a problem. However you should probably be asking yourself if it's actually of any use to the user? Is the content you're sending them to relevant to what they're looking for and is it improving their overall experience on your site?
Without access to your analytics data I'd struggle to say exactly what's causing it but after a quick look I'd recommend you have a deeper look into the following.
Panda - you've got a lot of ads above the fold on your articles and in places some of your sentence structure is a bit shaky. It might be that you've been caught out by one of the more recent Panda updates.
Links - you seem to have quite a few links from comments forms to your articles. It's probably worth doing some guest posting to get in context editorial links. Also it's worth remembering that Google have been cracking down on spammy link building a lot recently so even if you haven't been building spammy links it is possible that some of the sites linking to you were using spammy techniques and are passing less power now.
And as a bonus, you can explain to them how they're going to need to do some link building.
But not too optimal right? 
I don't have any suggestions apart from testing your new variations and seeing what happens. Although it sounds like introducing stricter naming conventions might help as well.
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also when you added the rel=canonical did 9 of your sites point to a primary site and was this the site that recovered?
Generally speaking most of the data SEOMoz has about your campaigns is stuff that a determined competitor could find out anyway (and probably for cheaper than bribing Rand).
Plus, if Matt Cutts wanted to get data on you he wouldn't have to go through a 3rd party - he'd just send round a Street View car.
Do your title tags also determine how you link internally and are they inserted dynamically into content on the page?
If not then (and let me know if I'm being dim-witted here) I don't see anything massively wrong with the first set of titles. Could you roll back and do a quick test to see if it works?
I'd ask that question again when people have a better understanding of the Penguin update.
Although with all of the work Google has done over the past few years to push brand and devalue exact match domains I would personally invest in something other than redirecting my domain to a genericKW dot net. This is just a philosophical point though - I don't have hard facts to back it up.
There are a multitude of reasons why your new domain may not be ranking above your retail stores. The age of the site is probably one factor amongst many.
If I were you I'd give it a month or so to settle down before panicking too much.
If it's mission critical that the new site rank #1 then I'd consider using PPC to get the visibility while you do some work on it.
If it's on that list then it's already live.
You'll have to wait for other changes, although I should probably mention that Google might not see a problem with the SERPs you're looking at. For example, if you're trying to compete on a competitor's brand name, you might be better off bidding on PPC rather than waiting for Google to decide that those results don't belong at #1, #2 and #3.