Probably a coincidence; rankings do fluctuate. If you had been hit by Emmanuel, you probably wouldn't be on the front page, let alone at #3.
In addition, your competitors could simply have improved their rankings (and as a result, yours dropped).
Yes, you probably won't do too well in the rankings if many of your pages differ by only a single number or name. Maybe add more unique information related to that person, e.g. where you got your information from, their last 3 tweets/status updates, etc. Although this is all aggregated content (which Google still considers inferior to unique content), it should help you rank while also giving your users more useful information.
I wouldn't say there is a limit, but if you are writing about a narrow topic, the more you write, the less focused your content may become. Although that will improve long-tail traffic, it could hurt the target phrase you are trying to rank for.
In addition, if you just start repeating yourself over and over again, it won't do you any favors. You are better off writing more succinctly and with high quality, so that people looking to solve a specific problem can visit that page and find a solution (and hopefully link back to it!). If it's too long, it might be harder for a person to find what they are looking for.
Not sure if that would work, but you can test it by changing your robots.txt and running a test in GWT > Health > Blocked URLs.
You might also be interested in blocking specific URL parameters (e.g. for /?sort=name&order=asc you could block the sort and order parameters) from within GWT (Configuration > URL Parameters).
Learn more about parameters - https://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
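As a sketch, parameterized URLs like the ones above can also be blocked directly in robots.txt using the wildcard patterns Googlebot supports (* and $). The paths and parameter names here are illustrative, not from a real site:

```
User-agent: Googlebot
# Block any URL whose query string contains sort= or order=
Disallow: /*?*sort=
Disallow: /*?*order=
```

Note that wildcard matching is a Googlebot extension; not every crawler honors it, which is why testing in GWT after each change is worthwhile.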
Yes, that is correct.
Once you have it set up, you can test what pages are blocked in your GWT account. Health > Blocked URLs > select which bot you want to test at the bottom of the page.
Resources:
Google's bots - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1061943
WMT Blocked URL testing - https://www.google.com/webmasters/tools/robots-analysis
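If you want a rough local analogue of the GWT Blocked URLs test, Python's stdlib robots.txt parser can check which URLs a given bot may fetch. A minimal sketch, with made-up rules and URLs (note the stdlib parser only does prefix matching; it does not understand Googlebot's * and $ wildcard extensions):

```python
# Check URLs against robots.txt rules per user agent, similar in spirit
# to GWT > Health > Blocked URLs. Rules and URLs are illustrative.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: Googlebot
Disallow: /private/
Disallow: /search

User-agent: *
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

for url in ("http://example.com/products",
            "http://example.com/private/page.html",
            "http://example.com/search"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)
```

Because the "User-agent: *" group has an empty Disallow, other bots are allowed everywhere, which makes it easy to compare how different bots are treated, just like selecting a bot at the bottom of the GWT test page.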
Might have to do with canonical URLs; I'm pretty sure the SEOmoz report doesn't take those into account when it tells you which pages are duplicates.
The best thing to do is check which pages are getting picked up by each source and compare/contrast the similarities and differences.
In case you haven't seen it, check out this post:
http://www.seomoz.org/blog/the-bigfoot-update-aka-dr-pete-goes-crazy
It talks about domain diversification and recent ranking fluctuations.
This has been happening for a while now. We use rank tracking with 100 results per page, and the results are usually different from those with 10 results per page. E.g. 10 results per page: ranking = 19; 100 results per page: ranking = 33.
The first time we noticed this was back in March '12, although it could have started earlier than that.