Hi Matt,
This is also happening to me. Each time I enter a new domain to research, OSE requires that I log in again. I have filed a help ticket with Moz and will update when I hear back from them.
Hi Jack,
It is normal for keyword rankings not to match up exactly; it is a pretty common occurrence due to personalized search and similar factors. In the top-right corner of the Keyword Analysis tool, you can select the search engine and country you want to pull SERP data from.
You might want to read further about the tool on the help hub: http://www.seomoz.org/help/keyword-analysis
Hope that helps and good luck!
Great point, Karl! Added incentive to make sure our clients' backlinks are authoritative and squeaky clean.
Hey Laura,
SEOmoz uses data from OSE for its DA, PA, and other link metrics. Because SEOmoz only updates the index every four weeks (lately it has been more like every two weeks), the data is somewhat delayed rather than instantaneous. In addition, the crawl itself takes two to three weeks to complete, meaning a new link will not necessarily be included in the very next update. In practice, the link should show up in OSE and impact your DA/PA within six weeks or so. One thing to keep in mind: the delay in OSE will not impact the timeframe in which Google considers the link for rankings.
I would definitely spend the next couple of weeks building additional high-quality links and cleaning up the backlink profile. It is rare for one authority link to 'undo' all of the problems that a spammy backlink profile has caused.
We read today's rand(om) question on responsive design. This is a topic we have been thinking about, and we ultimately landed on a different solution. In our opinion, the best user experience is two versions (desktop and mobile) that live on one URL.
For example, a non-mobile visitor to http://www.tripadvisor.com/ will see the desktop (non-responsive) version. However, if a mobile visitor (e.g., on iOS) visits the same URL, they will see a mobile version of the site, still on the same URL. There is no separate subdomain or URL; instead, the page dynamically changes based on the end user's user agent.
It looks like they accomplish this by using JavaScript to change the physical layout of the page to match the user's device. This is what we are considering doing for our site.
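For illustration, here is a minimal sketch of the kind of client-side check we are considering (the user-agent regex and the "mobile-layout" class are hypothetical examples of ours, not TripAdvisor's actual code):

// Minimal sketch: detect a mobile user agent and switch the layout on the same URL.
// The regex and the "mobile-layout" class name are hypothetical.
var isMobile = /iPhone|iPad|iPod|Android/i.test(navigator.userAgent);
if (isMobile) {
  // Same HTML document, same URL; only the presentation changes.
  document.documentElement.className += ' mobile-layout';
}

In practice, we would pair that class with a separate mobile stylesheet or layout rules.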
It seems this would simultaneously solve the problems mentioned in the rand(om) question and provide an even better user experience. By using this method, we can create a truly mobile version of the website that feels similar to an app. After all, mobile and desktop users have very different expectations and behaviors when interacting with a webpage.
I'm interested to hear the negative side of developing two versions of the site and using JavaScript to serve the "right" version on the same URL. Thanks for your time!
The backlinks from the duplicate-content press releases that were reposted by a network of news sites will not help you. That is unnatural duplicate content lacking natural anchor text variation, and the recent Penguin and Panda updates targeted this exact sort of thing.
Now, if you can get real editorial links from reporters, that makes the whole thing worthwhile!
The short answer is that those links are useless and will not benefit your company in any meaningful way.
Press releases by themselves should not be considered a good link building tactic. Press releases ARE effective when you use them to promote something you have done that is newsworthy. In this case, the press release is a hook to catch a journalist and, hopefully, earn positive PR and interviews that link back to you. It is not the link from the press release that helps you, but the editorial link from a highly authoritative site. Unless you have something exciting enough to share with a friend at a bar, you probably should not be using a press release for it.
If you would like to read more on this, check out this article:
http://searchengineland.com/how-prweb-helps-distribute-crap-into-google-news-sites-140597
Hi Raphael,
I agree with the previous assessment. However, I would add a couple of other things to keep in mind:
I hope this helps and good luck to you!
Hi Rand,
His portfolio is here: http://www.design.evasilev.com/
Overall, his work is good. However, the number of hours you are willing to budget will determine how detailed/customized he can get. In addition, make sure all of your information is in a thorough project brief so you don't run up charges clarifying the data.
You may also be interested in a recent post by Jon Cooper on infographics: http://pointblankseo.com/visualizations He has good tips on creating one on a budget, finding data, and promotion.
Good luck to you!
I have used an oDesk provider named Zhenia Vasiliev to produce high-quality infographics in the $500 to $800 price range. He lives in the UK, so communication is easy, but he can be a little late when it comes to deadlines. I would steer clear of most of the cheap designers on the freelance sites, though: a lot of them are just using programs like Piktochart, and communication can be a headache.
Regardless of what option you decide on, make sure that the infographic presents awesome information in a unique way that your target user base will enjoy. Right now the Internet is flooded with crappy infographics and there is nothing inherently magical about them. Good luck!
Thanks. I figured this was the case, but was not sure if I was missing any "best practices" about getting the previously blocked URL included faster.
In the past, I had blocked a section of my site (i.e. domain.com/store/) by placing the following in my robots.txt file: "Disallow: /store/" Now, I would like the store to be indexed and included in the search results. I have removed the "Disallow: /store/" from the robots.txt file, but approximately one week later a Google search for the URL produces the following meta description in the search results: "A description for this result is not available because of this site's robots.txt – learn more"
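For reference, here is the change I made (the User-agent line shows how a simple file would look; I am omitting any other rules in my file):

Old robots.txt (store blocked):
User-agent: *
Disallow: /store/

New robots.txt (store allowed):
User-agent: *
Disallow: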
Is there anything else I need to do to speed up the process of getting this section of the site indexed?