There's already a lot of good info out there if you search; either use the forum search or go here - http://www.google.co.uk/search?q=seomoz+plural&pws=0&hl=en&num=10 - (the .co.uk results are better than the .com results).
Not seen it in action, but I like it. I like it a lot 
First thoughts are that it'll never be usable on a massive scale (I'd be worried about security and people gaming the system for points if the rewards are any good), but for smaller sites I think it's very clever.
You've made it upside down 
Roger sees the User-agent: * record first, reads the Disallow: /, goes "okay :(" and goes away.
Simply change it to:
User-agent: rogerbot
Disallow:
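# (nothing disallowed, so Roger can crawl everything)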
User-agent: *
Disallow: /
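# (everyone else is blocked from the whole site)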
Keyword density?
ಠ_ಠ
http://www.seomoz.org/ugc/seo-myths-that-persist-keyword-density
http://www.seomoz.org/blog/some-opinions-on-the-seo-myths-realities-fight
http://www.miislita.com/fractals/keyword-density-optimization.html
http://www.seomoz.org/blog/perfecting-keyword-targeting-on-page-optimization
My vote's for headspace2.
I like that I can easily add JS and CSS on the fly to specific pages with it.
Make sure instant search is off though. I think the URLs mess up if that's on.
I don't use instant much though, I hate it 
It disables personalised search.
Eh, what?
You can set goals and funnels in GA and also craft URLs with the utm tags for different channels.
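For example (example.com and the tag values here are just placeholders), a tagged URL for an email campaign might look like this:
http://www.example.com/landing-page/?utm_source=newsletter&utm_medium=email&utm_campaign=autumn-sale
Analytics will then credit visits through that link to that source/medium/campaign combination.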
If you can elaborate on your question I may be able to help further. I'm not sure what you're trying to achieve.
Have you tried adding &pws=0 to the end of any search query? Does that make a difference?
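e.g. http://www.google.co.uk/search?q=your+keyword&pws=0 (with your own keyword swapped in).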
Even though you're logged out, Google knows 
robots.txt isn't a requirement; it's only voluntarily followed by spiders (as in, they can choose to ignore it), so I think you'll be fine without it. The default is 'allow all' and 'follow, index', so they should still be crawling the site correctly.
Check in Webmaster Tools by fetching as Googlebot, or alternatively find a page, put cache:pageurl.html into Google and see if it's been cached correctly.
That said, returning a 500 instead of a 404 may be causing an issue that isn't immediately apparent (500 is too generic a status to say specifically what), so I would try to solve it as quickly as possible. The benefits will depend on what you put in your robots.txt file.
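If you do decide to add one later, the minimal 'allow all' robots.txt is just:
User-agent: *
Disallow:
which blocks nothing for anyone.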
How are they Reddit pyjamas? There are no up or down arrows
Pictures will generally only do well if it's a comic or it's hosted on imgur.
AMAs are good for the reason you mentioned. If it's a popular AMA then you can get a lot of traffic.
News stories that make the front page can send hundreds of thousands of visits, but Redditors hate marketers with a passion, so you have to have a genuine story. Posting any old nonsense will just get you downboats.
This is a reasonable read on how it can be used for SEO - http://www.wolf-howl.com/seo/how-mturk-protects-my-site-from-panda/
There are also a whole bunch of other legit and (as you say) semi-legit reasons, from the +1/like/retweet suggested by the walrus to UX testing.
Dodgy stuff could be negative reviews for competitors or positive reviews for you on various sites.
Really you're only limited by whatever you can come up with.
Only thing I can think of is Dr Pete's experiment - http://www.seomoz.org/blog/catastrophic-canonicalization - but that's not quite the same as the example you're proposing.
I think you should still get a good idea of what happens after removing canonical tags though.
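If it helps to picture it, a canonical tag is just a link element in the page head, e.g. (with example.com as a stand-in):
<link rel="canonical" href="http://www.example.com/preferred-version/" />
Removing it takes that hint away, so the engines go back to picking a version themselves.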
397 links on the homepage :s
Well, it's definitely something to think about primarily because of how PR flows through links. You may want to consider consolidation for some pages - http://www.seomoz.org/blog/link-consolidation-the-new-pagerank-sculpting
It's not really a metric you should worry too much about unless you're going overboard with it (and while I've no idea exactly what counts as overboard, 400 is a lot), but check out these if you're worried:
http://www.mattcutts.com/blog/how-many-links-per-page/
http://www.youtube.com/watch?v=l6g5hoBYlf0
Unfortunately, with no hard and fast rule, it's down to you as to how much you think this may affect you.
It's much more likely that you're logged into a Google profile on one browser and not on the other, so your results are being personalised to a degree.
Could it be this or do I need to think about it more? 
Possibly months, but more likely (if they're good quality links on high-ish profile sites and not buried in a directory somewhere) it'll be at least a month.
The last index update was August 23rd (check on the OSE homepage to see), so I would expect another update around the end of September or thereabouts.
What do you mean? Just a character limiter that takes the first 165 characters of the post and adds it as a meta description? Or one that uses the excerpt? Or something else?
All in One SEO does auto descriptions, not sure how they're generated though - http://wordpress.org/extend/plugins/all-in-one-seo-pack/
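Whichever way it's generated, the end result is just a meta tag in the head, something like (placeholder text):
<meta name="description" content="The first 165 characters of the post would end up here..." />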
Tildes are okay these days, but 'unsafe'.
http://www.cs.tut.fi/~jkorpela/rfc/2396/full.html#2.3
Tilde was (begrudgingly) added to the unreserved character list a while ago, so Google should treat them fine without encoding.
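(When a tilde does get percent-encoded it becomes %7E, which is how you can end up with two versions of the same address floating around.)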
However, if you can avoid using them I would: leave the old addresses as they are, but from now on use a hyphen (still in preference to an underscore) instead of a tilde where you can.
Yup, Google should just grab a piece it thinks is relevant.
Additionally, if your meta description doesn't match what the user is searching for (but other content on your page does) Google will sometimes show that instead anyway.