Hi David,
I'm actually unsure. Could you send me the URL you're looking at? (Feel free to private message me if you're uncomfortable.)
The new Linkscape index is now live! We hit a small snafu rolling it out. Sorry for the mix-up.
Oh, and by the way: there are several different reports in the Web App that all update at different times.
1. Your Crawl - Each crawl takes about one week to complete and process the results. Generally, you'll see the results 5-7 days after the beginning of your crawl. If it's been significantly longer than that, contact the help team (help@seomoz.org)
The date of your crawl is dependent on the day you first set up your campaign. This cannot be changed.
2. Rankings - Once per week. Because rankings are checked independently of your crawl, the actual date they update in your campaign may differ from your crawl date.
3. Traffic - Once per week. Always Monday through Sunday.
4. Link Analysis - About once per month, following updates in the Linkscape Index
Hi Jorge,
Good question! If I understand correctly, you have a choice between two file structures - is that right? Using option 1, is there anything that would actually live on website.com/utilities/, or is it simply a URL directory with no actual webpage?
If this is the case, both options 1 and 2 would pass the same amount of link juice, assuming you linked directly to each target. That said, it's usually desirable to keep important pages as few directory levels (and clicks) away from the root as possible.
For this reason, if you are linking directly to each target, I would choose the second option if possible, although the difference it makes probably isn't that great.
Hope this helps! Best of luck with your SEO!
Thanks for your comment. I understand your frustration, especially if you found the marketing message to be confusing. As Keri mentioned, your comments have been shared with the team.
The differences between the sizes of link indexes have been covered extensively in other places, so I won't rehash them here. One thing I should point out: 24 links only represents the number of links to this specific page/URL. Open Site Explorer shows a total of 156 links pointing to the root domain bagnodesignglasgow.com
http://www.opensiteexplorer.org/comparisons?page=1&site=http%3A%2F%2Fwww.bagnodesignglasgow.com
I've added a screenshot below. This represents about 15% of the links listed for your site in GWT, which is on the low end in our experience. It's pretty common for OSE to report between 20-40% of the links found in GWT. The good news is OSE tends to index the links that "count" and are more likely to influence rankings. And as Keri mentioned, no index will be able to capture all of the links Google sees. We are working to greatly expand the size of our index, and we should be able to ship some progress on this in the next few months.
Regardless, thanks again.
Hi Mark,
I used to troubleshoot these types of problems (mysteries!) when I worked on the SEOmoz help team.
The best thing to do would be to contact the Help Team (help@seomoz.org) and include information about your account, URL, and campaign. They can take this information and see if there is anything odd about your website, if there is a bug in the crawling software, or if some strange quirk of incompatibility is causing this behavior.
If you would rather, you can PM me with the info and I can try to troubleshoot it myself, but the Help Team has a few more tools and access to engineers, so they might be the better choice. Either way, let us know if you have any trouble.
Hi Rusty,
For a minute I had to shake the cobwebs out of my head to remember what External MozRank was.
For the most part, SEOmoz tools and Open Site Explorer hardly ever display External MozRank. Part of the reason is that the algorithm that calculates MozRank has been improved over the years. You can still see External MozRank metrics, but they are hard to find. To do so, you need to use the Open Site Intelligence API.
It takes some technical know-how, but once you get the hang of it you can access all kinds of special metrics.
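If you want to experiment, here's a rough sketch of what a call might look like in Python. Fair warning: this is from memory, so treat the endpoint, parameter names, and signing scheme as assumptions to check against the current API docs, and the Cols value below is just a placeholder for whichever bit flag maps to External MozRank.

import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus
from urllib.request import urlopen

ACCESS_ID = "member-xxxxxxxx"   # your API Access ID (placeholder)
SECRET_KEY = "your-secret-key"  # your API secret key (placeholder)
COLS = "0"                      # placeholder: look up the External MozRank bit flag in the API docs

# Requests are signed with an expiring HMAC-SHA1 signature
expires = int(time.time()) + 300
signature = base64.b64encode(
    hmac.new(SECRET_KEY.encode(), f"{ACCESS_ID}\n{expires}".encode(), hashlib.sha1).digest()
).decode()

target = quote_plus("www.seomoz.org")
url = (f"http://lsapi.seomoz.com/linkscape/url-metrics/{target}"
       f"?Cols={COLS}&AccessID={ACCESS_ID}&Expires={expires}"
       f"&Signature={quote_plus(signature)}")

print(urlopen(url).read())  # JSON response containing the requested metrics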
But this is more work than most people really need in their lives. For the most part, if you concentrate on Page and Domain Authority, you should be just fine.
And we really need to update those descriptions. Thanks for the heads up!
Hi Mikkel,
Like Chris, I spidered your site and couldn't find any links to /index.php files, which probably indicates one of two things: (1) you've already fixed the problem and removed those links, or (2) the links are coming from somewhere our crawler can't see, such as old pages still in Google's index or external sites.
In the Crawl Errors report in Google Webmaster Tools, if you click on the link of each 404, there's often a "linked from" source where you can see where Google discovered the broken link. This is really helpful in rooting out the cause.
Regardless, I'm going to go with #1 and optimistically believe that you were able to fix the problem. 
Not quite sure what you'd like to know about Domain Authority, but you can find lots of information about it here: https://moz.com/learn/seo/domain-authority
If there is a specific question we can answer, be sure to let us know!
Additionally, don't forget to set the correct search engines. Do you want to track your progress using Google US, or Google UK?
You can change your search engines under the "campaign settings" tab underneath the Campaign Overview. You can select from over 150 search engines operated by Google, Bing, and Yahoo around the world.
Ah, I get it now. I'm not a math whiz, but the exponential factor is certainly going to throw you for a loop in this calculation. Domain Authority uses an exponential base between 6 and 7 (meaning a DA of 50 is 6-7 times more powerful than a DA of 40). I'm sure there's some way to factor that into your calculation, but I went to public school, so it's beyond my simple math skills.
But you might be on to something. My estimation is that a site with a DA of 40 is about 30x more powerful than a site with a DA of 23, all other things being equal.
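If you want to sanity-check that estimate yourself, here's the back-of-the-envelope math, assuming each 10 points of DA multiplies "strength" by that 6-7x base:

# Rough relative "power" of DA 40 vs. DA 23,
# assuming each 10 points of DA multiplies strength by the base.
for base in (6, 7):
    ratio = base ** ((40 - 23) / 10)   # base^1.7
    print(f"base {base}: ~{ratio:.0f}x")   # ~21x and ~27x

So somewhere in the 20-30x range - the same ballpark as the 30x guess above.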
This conversation reminds me of two related topics: finding your true competition, and search visibility. If you continue down this path, you may be interested in reading up on both.
Hi Jeremy,
Most of the time, folks use sitewide footer links like these for nefarious purposes, so Google generally tends to look down on these types of links and devalue them.
In your case, however, it seems you actually have a use for them, as they bring a good amount of relevant traffic to your various sites.
But you have to be careful. Footer links generally carry very little value and cause more potential harm than good. And as you noted, sitewide anchor text (especially in the footer!) can get you in trouble in a Penguin kind of way.
The good news is 3 sites isn't a lot, and it's possible you haven't tripped any penalty filters.
I do like your idea about changing up the anchor text. Do you rank for those terms? If so, have you seen a drop or rise in the past year? It may be beneficial to experiment with images and less commercial anchor text.
Seems like you're on the right track. Let us know how it goes!
Great question. I recently wrote an entire blog post about this very topic. Instead of listing all your options here, I think I'll just link to the post. http://www.seomoz.org/blog/google-personalized-search
By the way, sorry about the intermittent problems with the Rank Tracker. We're aware of some instability and our team is working on it.
Also - for non-localization, set your location to "United States". Not perfect, but it works pretty well.
Good luck!
Hi Alan, that is an odd response. The traffic numbers are pulled from Google Analytics. My understanding is that GA is capitalization-agnostic - meaning it doesn't matter whether the keyword is capitalized or not. Let's see if we can get an SEOmoz engineer to weigh in.
It's likely such a large drop was accompanied by a loss of key links from one Linkscape update to the next. It's also possible the huge jump in the number of links came from a relatively small number of sites - for example, a sidebar link from a blog that repeats across hundreds of posts, archive, category, and tag pages. These types of links pass little value.
Finally, there is some natural fluctuation of these scores from one index to the next. MozRank is a comparative metric, so it's best not to track your historic MozRank against itself, but to compare each update against your competitors.
Hopefully with the next update you'll see a rise in your MozRank!
500 errors can have a multitude of causes, and for the non-technical they can be very hard to track down and fix.
The first thing I would look at is whether it's a repeating problem in Google Webmaster Tools or a one-time issue. These errors will show up in GWT for a long time - but if it's not a repeating problem, it's probably nothing you need to worry about.
Wait - I assumed you found the problems in GWT, when you may have found them using the SEOmoz crawl report instead. Either way, you should log into the Google Webmaster Tools Crawl Errors report and see if Google is experiencing the same problems.
Sometimes 500 errors are caused by over-aggressive robots and/or improperly configured servers that can't handle the load. In this case, a simple crawl delay directive in your robots.txt file may do the trick. It would look something like this:
User-agent: *
Crawl-delay: 5
This would request that robots wait at least 5 seconds between page requests. But note, this doesn't necessarily solve the problem of why your server was returning 500s in the first place.
You may need to consult your hosting provider for advice. For example, Bluehost has this excellent article on dealing with 500 errors from their servers: https://my.bluehost.com/cgi/help/594
Hope this helps! Best of luck with your SEO.
There are many, many different types of duplicate content, and how you handle it depends on the specific type of duplicate content and your needs.
If you haven't already, I highly suggest you read Dr. Pete's excellent post on dupe content here: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
In your specific case it looks like you have multiple parameters serving the same basic content as your homepage. Is this correct?
In this case, you should set a canonical tag on each of those duplicate pages pointing to the homepage. This also has the benefit of resolving the errors in the SEOmoz PRO app.
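For illustration (using example.com as a stand-in for your actual homepage URL), each of those parameter-based URLs would include a tag like this in its <head>:

<link rel="canonical" href="http://www.example.com/" />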
It also sounds like you've addressed the issue in Google's Webmaster Tools. Unfortunately, Google doesn't let SEOmoz sync with Webmaster Tools, so anything you set there won't show up in the Web App.
Finally, don't forget about Bing Webmaster Tools. They have similar parameter settings you can submit.
By the way, some SEOs would suggest putting meta robots "NOINDEX, FOLLOW" tags on those duplicate pages. While this could send conflicting signals when coupled with the canonical tag, it is a valid approach in some situations.
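For reference, that tag would go in the <head> of each duplicate page and looks like this:

<meta name="robots" content="noindex, follow">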
Hope this helps! Best of luck with your SEO.
Great question. For the most part, no - but there are exceptions.
The rankings in the PRO platform are designed to be geo-agnostic, meaning they don't take localized results into account. So if your keyword is "pizza" and your search engine was set to Google US, your rankings would reflect a general United States point of view, like this.
On the other hand, if your query contained local-specific words, like "best pizza in Chicago" Google will automatically return local results, like this.
While these aren't technically "places" results, they do bear some similarities, and these are the results you will find in your PRO ranking report.
One final note: the above only applies to PRO campaigns. The standalone Rank Tracker tool uses less sophisticated technology and has been known to include weird local results from time to time. Look for an update to this tool soon.
The Keyword Difficulty tool gives you a best guess of how hard it would be to rank for a particular keyword, although it's not foolproof. For each keyword, the tool gives you an analysis on which you can base your best judgment.
When using the Keyword Difficulty Tool, it's helpful to look not only at the raw number, but also at the metrics contained in the Keyword Analysis report beneath it. Be sure to run a full report for important keywords. This will give you insight into exactly why each URL ranks the way it does.
For example, the keyword analysis will show you the number of inbound links for each page, on-page factors, and all relevant metrics. When you compare your own site against these factors (you can add your own URL to the report) you can judge how likely you are to rank.
Oftentimes you can earn a top ranking on a keyword with a high Keyword Difficulty score if you have advantages that the others don't. This is particularly the case when the competition is poorly optimized on-page for the given keyword.
Hope this helps. Best of luck with your SEO.
Hello Prime85,
Both these answers are correct in their own way, but let me clarify and add my 2 cents.
1. 404s don't hurt your rankings directly, but they can provide a poor user experience.
2. If you keep URLs "live" - then Google can keep these URLs in their index indefinitely. This means search engines may waste time and crawling resources visiting pages you don't want in the index, while ignoring your other pages. This CAN hurt your SEO.
Long story short (like Brian says), if the page is no longer relevant, you should remove it from the index or redirect it to another URL.
3. Returning a 404 kills all link juice that may have gone to the page, and it can also send confusing signals to search engines about the structure of your site if you have a bunch of pages returning 404s at the same time you have a bunch of new, but similar, pages popping into existence.
The best policy is to set up a 301 redirect from your outdated pages to the most relevant new pages. Don't redirect everything to a single page like the homepage; instead, redirect each one to the page that would be most relevant and useful for the user.
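How you set up the 301 depends on your server. On Apache, for example, a line like this in your .htaccess file does the trick (the paths here are made-up examples):

Redirect 301 /old-page/ http://www.example.com/new-page/

That tells search engines the move is permanent, so most of the link juice follows along to the new URL.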
On the other hand, if it's a low-value page and there's really no need to redirect it, you should remove it from the index. There are a couple of ways to do this: let the URL return a 404 (or, even better, a 410), or add a meta robots noindex tag to the page.
Hope this helps! Best of luck with your SEO.