Good question. I noticed this and thought it was just me! 
I'm sure the SEOmoz team are ninjaring the problem right now. 
I like your second screenshot with the links to the recent issues that persist across all the pages, and I think keeping them separated by year makes them easy for users to navigate and easy for search engines to crawl.
I would comment, whilst you are restructuring, that your content is very deep. It seems everything on your site exists at least 2 folders deep:
/online/en/home.html
Given your site seems to be only available in English I don't see the necessity of the 'online' or the 'en' directories.
Secondly, the publications:
/online/en/home/publications/journals/afp/afpsearch.html
Here having 'publications' and 'journals' seems superfluous, and now a 'home' directory has also appeared. Your content is now 6-7 levels down! I'm unsure what CMS / software is powering the site, but I'm sure this could be changed, so you could have something much cleaner, which is much easier:
/journals/afp/search.html
instead of:
/online/en/home/publications/journals/afp/afpsearch.html
Which I think would be much nicer for users and engines. 
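If you do restructure like that, remember to 301 the old URLs to their new homes so any existing links keep their value. On Apache that might look something like this (purely a sketch: I don't know what your server setup allows, and the rule below only covers the one example page):

RewriteEngine On
# Old deep URL -> new flatter URL (you'd want one rule, or a pattern, per section)
RewriteRule ^online/en/home/publications/journals/afp/afpsearch\.html$ /journals/afp/search.html [L,R=301]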
Best of luck! 
I'd suggest that rel canonical is the perfect way to handle this. What seems to be the problem with this approach that makes you want to change it?
Well, it depends! Use of the rel=canonical tag sounds fine to me, I agree they'll feel more 'at home' with a .de domain.
However, what is the strategy for the German users to find the .de version of the site? I'm guessing AdWords or similar where you display the .de domain to increase the CTR. However, it would be nice to be able to bring users in via the organic rankings.
Seeing as this is a valid problem faced by many, it is a shame the engines haven't yet provided a better way to handle it.
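For reference, the tag on each page of the .de copy would look something like this (domains and paths are made up, and I'm assuming the main site is the .com):

<!-- in the <head> of the .de version of a page -->
<link rel="canonical" href="http://www.example.com/some-page/" />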
Viel Glück! 
Chalk me down for Cutts too.
However, I'd have to also add Larry and Sergey to the list. 
Hi!
The problem you have is that you are using two different Apache modules, and I don't believe you can control the order they run in. You are using mod_alias and mod_rewrite.
It sounds like you have a line like:
Redirect permanent /news/dinosaur/ninja http://www.oursite.com/blog/ninja
This is the mod_alias rule, and it is firing before your mod_rewrite rules (and then it no longer fits their criteria, so they don't fire).
So, what you need to do is change your mod_alias rules into mod_rewrite rules, so then you can control the order (they execute top to bottom, so you just put your 'specific' redirects above your general ones). Your rule should look something like this (I've not confirmed it, and note that in a .htaccess file the pattern is matched without the leading slash):
RewriteRule ^news/dinosaur/ninja/?$ /blog/ninja [L,R=301]
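For context, the whole block might end up looking something like this (a sketch only; the general /news/ to /blog/ rule below is just my guess at what your existing rules do):

RewriteEngine On

# Specific redirects first -- rules run top to bottom, so these must sit
# above the general ones or they'll never get the chance to fire
RewriteRule ^news/dinosaur/ninja/?$ /blog/ninja [L,R=301]

# General rules after (this catch-all is hypothetical)
RewriteRule ^news/(.*)$ /blog/$1 [L,R=301]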
I hope all that makes sense. Let me know how you get on!
I would differ slightly in my approach. If you robots.txt block the other sites, then any organic links they build up will be worthless.
Rel canonical across domains should be fine, so put that in place. Then meta noindex, follow: this way the juice flows in at least. Make sure each rel=canonical points from the page on the duplicate to the equivalent specific page on the main site, obviously.
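In practice that's something like this in the <head> of every page on the duplicate domains (domains and paths are placeholders):

<link rel="canonical" href="http://www.mainsite.com/some-page/" />
<meta name="robots" content="noindex, follow" />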
I'm not sure if anyone has any solid data on this, and even if they did I think it would be so conditioned on many other factors that it wouldn't be much help.
The obvious pro of allowing dofollow links for commenters on your blog is that it encourages people to get involved, and maybe they stick around and become part of the community. The downside is the spammers who will invade!
You can argue both sides...
If people are going to bother to write good-quality comments that are useful for building a community, and are themselves going to be persuaded to stick around, that requires that you are creating great content. If the content is crappy then people won't bother crafting great replies. So if the quality is crappy and you allow dofollow, you'll get (other than the spammers...) people who are maybe nice but aren't really interested in your community; they just want a link.
And if the content is great, then do you need the dofollow to persuade people to comment?
On the flipside, if the content is pretty good, maybe a dofollow link helps push people over the edge and makes them bother to write the comment... If the community is small you should be able to manage the spam.
At the end of the day - I think you need to test in your niche and see what happens... Good luck! 
My suggestion: redirect mobile browsers to the HTML version, just as you do for the search bots. If the browser is not a search bot or a mobile device, and has Flash installed, then redirect to the Flash version (not using a 301!) with a parameter that takes the person to the same content on the Flash site (in the swf you can read the parameter and load the relevant part). Mobile users or search bots arriving via those links with parameters can be 301'd to the HTML version.
Now mobile users and search bots get a nice HTML version. Flash users get the 'rich' experience. Search engines can crawl all the content nicely. Furthermore, it would stand up to a manual review.
Many more details in answer to a similar question here:
http://www.seomoz.org/q/converse-com-flash-and-html-version-of-site-bad-idea

There are 2 aspects to consider when you discuss gaming it. The first is the effect it has on the SERPs for everybody; the second is how it affects things for social users and what the effect on CTR is.
For the 1st, the answer is: the same way they do with links. Check for cliques, for lots of unused accounts +1'ing the same sites, etc. We'll see how effective that is.
For the 2nd, one important fact is going to be that you can't game people's networks. You can create a bunch of bogus accounts and create +1 groups for circular +1'ing. However, none of that matters if you aren't in people's networks. Currently it seems that +1's only show up if at least one friend/contact has +1'd the site. Then you see the total count (see attached image); otherwise, as it stands, you aren't going to affect people's CTR (which itself will be a signal).
I agree about Google's previous efforts, however I think +1 is finally going to be their breakthrough. They are going to push it hard, and unlike the other efforts, there is no real effort from the user to learn about it. I mean - most people never bothered to try to understand Google Buzz, whereas the concept of a 'like' button is really simple and widely known.
Currently, you cannot embed them into your website (it is coming, but not for 'months'), so that aspect isn't yet in play.
If you consider this - Google know that social is becoming increasingly important and is going to keep growing. Furthermore, they seem to have reached a limit with using links to determine what results to show, and are in need of new signals.
Everyone knows social signals are becoming more important, and Google really doesn't want to be relying only on signals under the control of others. I think +1 is here to stay, in one form or another.
However, currently you can only +1 a site from the SERPs, which makes it hard for companies to leverage this. Yet I'm sure people will find a way.
So Google just announced the +1 button, which is their answer to Facebook's 'Like' button:
http://searchengineland.com/meet-1-googles-answer-to-the-facebook-like-button-70569
http://techcrunch.com/2011/03/30/google-plus-one/
They should begin showing up now on Google US. If you aren't seeing them, you can turn them on here:
http://www.google.com/experimental/
So I want to know: what effect do you predict these are going to have? Is there anything we can do from an SEO point of view to begin leveraging +1 right from the outset?
Hi Susan,
We can help you through it, but we need to know specific bits you are stuck on. 
It looks like you are already getting some bits right, but other bits quite wrong. Your page titles need to be relevant to the content, which sometimes isn't the case. Your sitemap page has many many links on it, too many for search engines, and probably too many to be helpful to humans too. These little things will add up.
The important thing is to relax and take it one step at a time!
-Tom
Great question!
Firstly - unfortunately, Steve's suggestion isn't going to be viable for you. The # portion of the URL is never sent to the server, so it's not available to your code server-side and you won't be able to determine where the rel canonical should point.
Furthermore, if they are committed to keeping the flash for now, and all as a single unit so one URL (the homepage), then you are going to have to accept that some juice intended for subpages is going to go to the homepage. You cannot do anything about that aspect, so you need to focus on the rest of the problem. However, whilst far from ideal, at least the juice is hitting the site somehow.
So… what to do?
Firstly, I'd start getting into the mindset of thinking in terms of the HTML site as the main/canonical site, and the Flash site as the 'enhanced experience' version. In this way, the HTML version is going to be the version that should be crawled by Google, and should be linked to.
Actions:

1. Get all links in articles, press releases, directories or whatever else that are linking to specific pages and are originating from in house (or any source you have control over) to link to the HTML pages.
2. If the user arrives via an HTML link and has Flash, you can now redirect them to the Flash link for that page so they get the 'enhanced experience'. Don't use a 301 redirect -- remember, the HTML version is the main version!
3. If the user arrives via a Flash link but doesn't have Flash, and does have JavaScript, you can detect the # variable and redirect them to the HTML page to help them along (there's a rough sketch of points 2 and 3 below).
4. Educate the relevant stakeholders regarding point 2. I see you have a 'flashmode=0' option; tell them about this and how to use it to get the URLs they need.

Is this cloaking? No! The HTML version is the main version, remember? It's no more cloaking than if you detected the user agent and then chose to serve the Flash version to Googlebot.
I actually discussed this with Jane Copeland at the fantastic Distilled link building event a couple of weeks back, and she agreed with me and said if it would stand up to a manual inspection then it is the right course of action.
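To make points 2 and 3 a bit more concrete, here is a very rough client-side sketch. The paths, the ?page= parameter and the function names are all hypothetical, purely for illustration; adapt them to however the Flash site actually addresses its sections.

<script>
// Hypothetical sketch of points 2 and 3 above -- paths and parameter names
// are made up for illustration, not taken from the actual site.
function hasFlash() {
  try {
    if (navigator.plugins && navigator.plugins['Shockwave Flash']) { return true; }
    return !!new ActiveXObject('ShockwaveFlash.ShockwaveFlash'); // older IE
  } catch (e) {
    return false;
  }
}

// Point 2: dropped into an HTML page such as /products/widget.html, this sends
// Flash-capable visitors over to the Flash site with a plain JavaScript
// redirect (not a 301), passing the section so the swf can deep-link to it.
function sendFlashUsersToFlashSite(section) {
  if (hasFlash()) {
    window.location.replace('/?page=' + section);
  }
}

// Point 3: dropped into the Flash homepage, this reads the # fragment for
// visitors without Flash (but with JavaScript) and sends them to the matching
// HTML page.
function sendNonFlashUsersToHtmlPage() {
  if (!hasFlash() && window.location.hash) {
    window.location.replace('/' + window.location.hash.substring(1) + '.html');
  }
}
</script>

Each half obviously only goes on its own side of the site, and you'd call the relevant function on page load.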
So where does this leave us?
The search engines can crawl all your lovely content, and they can ignore the flash version completely.
You are getting inbound links to specific pages. These pages have their own titles and meta descriptions… and content! Because they are the real site!
Users with Flash arriving via these links are landing on the correct Flash page of the site and are experiencing the rich site that you want them to.
Users arriving without Flash are getting the correct page if they arrive via an HTML URL. If they arrive via a Flash URL then they get the correct page if they have JavaScript on (e.g. iPad users), or they get the fallback of the homepage (rare).
I had a client with an almost identical situation, and I rolled out an almost identical solution to this, and they got crawled very quickly, shot up in Google and have stayed there for months.
Hope it helps. Let us know how you get on! 
For the amount of effort you'd spend on such a scheme, you could do a lot better by just trying to think outside the box and more creatively than the others in your niche. When you force yourself to come up with a purely white hat solution that can match or outperform a grey/black approach, you often end up with something that, whilst maybe not as quick, is going to be more sustainable and add value to your users.
I encourage you to think about what would be really awesome content/tools/linkbait for your industry, and what hasn't been done, or hasn't been done well (or hasn't been updated for some time - this is a good one), etc. Then do that. You will not be disappointed you bothered.
I've not heard of the spike lasting a couple of months before - more like a few days. Maybe the factor that you mentioned, that they are low-traffic keywords, played into this. Interesting.
I agree that you should be able to upload a big image, but unfortunately most of the libraries I've seen for doing backend image manipulation don't do a great job scaling down large images. They often end up blurry.
A good graphics program, on the other hand, will sharpen properly and you'll end up with a much better-looking image.
Would be interesting to see if yours looks better if you upload a pre-scaled one. 