Yep! The sitewide link penalty, also commonly known as the footer link penalty or the "web design by" penalty, is a pretty common (though not 100% universal) link dampener. Google mostly just ignores those links now, but it sometimes used to actively penalize for their presence (and may still in certain cases). My best advice is to link from your about page or another well-linked-to page on your site instead of linking from every page.
Posts made by randfish
-
RE: What is this exactly? Whiteboard Friday warning about footer links.
-
RE: Houston Company Needs Help (Will Our SEO Work Be Destroyed While Site is Down?, Can Anything be Done?)
You might want to check out https://moz.com/blog/how-to-handle-downtime-during-site-maintenance
Depending on the length of the expected downtime, a 503 is probably the way to go (and will tell Google it's a temporary issue).
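A minimal sketch of that 503 setup, assuming a Python/WSGI stack (the app name and the one-hour maintenance window are my own placeholders, not from the post):

```python
# Serve every request with a 503 plus a Retry-After header during maintenance.
# Google treats the 503 as "temporarily down, come back later" rather than
# dropping the pages from its index.

RETRY_AFTER_SECONDS = 3600  # assumed one-hour maintenance window

def maintenance_app(environ, start_response):
    body = b"Down for scheduled maintenance - back soon."
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/plain; charset=utf-8"),
        ("Retry-After", str(RETRY_AFTER_SECONDS)),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# To actually run it:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, maintenance_app).serve_forever()
```

The same idea works in any stack; the important parts are the 503 status and the Retry-After header.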
Congrats on the rankings gains BTW! Nice work!
-
RE: How to improve PA of Shortened URLs
I'd highly doubt it. You're spending energy getting a redirect linked-to and indexed, rather than the URL you actually want to rank. Point those links and that effort to the page itself and you'll get a far better return.
-
RE: How to improve PA of Shortened URLs
The shortened URLs will pass any link equity they have to their target, but linking through a shortener has no added benefit, and the 301 redirect will actually cost you some PageRank leakage, according to Google.
-
RE: How to improve PA of Shortened URLs
Either we haven't crawled any links that point to #2, or the links we've seen to it don't pass any link equity (e.g. they're nofollowed or on pages with meta robots=nofollow, etc).
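If you want to audit which links on a page are nofollowed, a quick sketch with Python's stdlib HTML parser (the class name is my own) might look like:

```python
from html.parser import HTMLParser

# Separates a page's links into those that can pass equity and those
# marked rel="nofollow" (a page-wide meta robots nofollow would also
# block equity, but isn't handled in this tiny sketch).
class NofollowAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        if "nofollow" in (a.get("rel") or "").lower():
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

audit = NofollowAudit()
audit.feed('<a href="/a">x</a><a rel="nofollow" href="/b">y</a>')
print(audit.followed, audit.nofollowed)  # ['/a'] ['/b']
```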
-
RE: How to improve PA of Shortened URLs
Hi Monu - shortened URLs generally aren't going to accrue much PA (or much link equity), because many (most) folks who link to them won't link to the shortener but to the URL it resolves to.
I'd also say that there's almost no circumstance I can imagine where it's actually useful or desirable to have a high PA score (or high ranking ability) for the shortened URL. You want the URL that actually resolves -- the one Google will show in its listings -- to get all the links. Shorteners could go out of business, stop redirecting properly, or switch from 301s to something else to track clicks, and then the final target would lose that link equity. Thus, it's always better to point links at the resolved URL.
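To see why the extra hop matters, here's a toy model of a shortener's 301 sitting between the link and the URL you actually want to rank (all URLs and the `resolve` helper are hypothetical):

```python
# Fake "web": each URL maps to (status_code, redirect_location_or_None).
FAKE_WEB = {
    "https://short.example/abc": (301, "https://yoursite.example/landing-page"),
    "https://yoursite.example/landing-page": (200, None),
}

def resolve(url, web=FAKE_WEB, max_hops=5):
    """Follow redirects and return the final URL plus the hop count."""
    hops = 0
    while hops < max_hops:
        status, location = web[url]
        if status in (301, 302) and location:
            url, hops = location, hops + 1
        else:
            return url, hops
    raise RuntimeError("too many redirects")

final, hops = resolve("https://short.example/abc")
# Every extra hop is a place where equity can leak or the chain can break
# if the shortener disappears or changes its redirect type.
```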
-
RE: Keyword Explorer is Now Live; Ask Me Anything About It!
Hi Chris - yes! We know about this bug with the different tabs/windows and have a ticket in to the dev team to address. Thanks for the heads up.
-
RE: What is a Good Keyword Priority Score?
Hi Stayfirst - yes, you can certainly use the location-specific keyword as a jumping-off point. For example, I tried "Seattle Real Estate" and there's loads of good suggestions and a campaign I can build off the keywords that come out of that.
And yes - you're totally right that getting national rankings when you only serve one area could be a bad thing, as the engagement rate of those visitors would quickly tell Google you're probably not relevant for the majority of them. Way better to target the area you're aiming for (and vastly less difficult too).
-
RE: Mozscape API Updates (Non-updates!) - becoming a joke!
Hey Matt - I can get into some of the nitty gritty details on this.
Basically - we've been having trouble of all kinds with Mozscape, and while our team has indeed been working around the clock, the reality is that it's an old, clunky, hard-to-understand system that needs to be replaced entirely. That work is also going on, but as you might imagine, has a separate team on it, which means the Mozscape team's bandwidth is split.
Mozscape has crawling trouble - we've had issues with our own crawler design, specifically with spam that's fooled our crawlers (it's designed to fool Google, obviously, but it's caught us too) and biased our index. We also had an issue where some code that helped us recrawl important pages was commented out; that, along with other issues (and a couple of longtime engineering departures), kept the problem invisible to us for a good few months. Even with it fixed, it will take an index or two to get back to normal.
We've had other issues with hardware and bandwidth restrictions, with team changes, with unintentionally excluding important sites and important pages on sites due to erroneous changes on our end, and with robots.txt interpretation mistakes. You name it. It's been pretty frustrating because it's never a single issue coming up again and again, but rather new issues each time.
The team currently on the Mozscape project is relatively new -- we had almost complete turnover on that team in the last year (a combination of voluntary and not), so there's a lot of ramp-up: trying to understand what things do, fixing old problems, etc. I'm sure as an engineer you're familiar with those types of challenges, especially when the documentation isn't pristine.
IMO - those are crappy excuses. We should be better. We will be better. I don't provide them to pardon our shitty quality the last few months, but rather because you said you wanted detail, and I do love transparency.
I think we're going to have a tough slog until the new index system comes out (likely this Fall). I'm keeping my fingers crossed that we can repair each new problem and that few others arise, but the past 6 months have made me wary of overpromising and under-delivering.
BTW - it is true that the ML model means there's a lot of DA flux, since the goal is to track Google's changes as accurately as possible; if we see a site whose inputs match the patterns of sites that don't rank as well, that site's DA will drop. Given that Google's rankings fluctuate all the time, that our crawls fluctuate a lot (more than they should, as noted above), and that the link graph changes constantly, a lot of flux in DA is to be expected. That said, the new model will have DA refreshed daily rather than monthly, and will also have history, as well as a way to dig in and see which inputs weigh heavily in DA and how those have changed. I think all of that will make these shifts vastly more transparent, even if they continue to be large (which they should be, so long as Google's own flux is high).
One thing I am working on with the team - a different kind of score, called something like "domain visibility" or "rankings visibility" that tracks how visible a site's pages are in a large set of Google rankings. I think that score might be more what clients are seeking in terms of their overall performance in Google, vs. their performance in the link graph and how their links might be counted/correlated with higher/lower rankings.
-
What is a Good Keyword Priority Score?
Howdy gang,
This is my last discussion post in the series on keyword metrics in KW Explorer & Moz Pro (previously on Keyword Difficulty, Opportunity, & Volume). In this one, let's chat about the "Priority Score," a feature you'll find in Keyword Explorer on any lists you build.
Priority was conceived to help aggregate all the other metrics - Difficulty, Opportunity, Volume, and (if you choose to use it) Importance. We wanted to create an easy way to sort keywords so the cream would rise to the top -- cream in this case being keywords with low difficulty, high opportunity, strong volume, and high importance (again, if you choose to use it). Thus, when it comes to Priority Score, there's no particular number you should necessarily seek out, but higher is better.
When you get into the ranges of 80+ (which is quite rare; Single Malt Scotch is one of the few examples I could find, and only because its volume is so high and there are only a couple of SERP features), you're generally talking about keywords with high demand (lots of monthly searches), difficulty that isn't too crazy (a website in the 55-80 DA range might have a shot), and decently strong CTR Opportunity (usually not too many SERP features taking clicks and attention away from the organic web results). Below that score range, you're usually finding keywords where one or more of those isn't true -- there's either lower volume, heavier competition, or lots of SERP features with the accompanying lower estimated CTR.
When you're building KW lists, my view is that there are no "good" or "bad" Priority scores, only relative ones. Priority should be used to help you determine which terms and phrases to target first -- it's like a cheat code to unlock the low-hanging fruit. If you build large lists of 50-100 or more keywords, Priority is a powerful and easy way to sort. It becomes even more useful if you use the Importance score to add an estimation of a keyword's value to you/your business/your client into the mix. In that case, Importance can cut Priority by up to 2/3rds (if you set it at 1) or raise it by a little more than 3X (if you set it at 10). This is hyper-useful for nudging keywords with middling scores up if they're super-important to your marketing efforts.
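Moz hasn't published the exact formula, but a hypothetical sketch that matches the multipliers described above (cut to roughly a third at Importance 1, roughly tripled at Importance 10) could look like:

```python
# Hypothetical illustration only - not Moz's actual Priority math.
# A geometric scale fits the described behavior: Importance 1 divides the
# base score by ~3, Importance 10 multiplies it by ~3, mid-scale is neutral.
def importance_multiplier(importance):
    assert 1 <= importance <= 10
    return 3 ** ((importance - 5.5) / 4.5)

def adjusted_priority(base_priority, importance):
    # Scores are capped at 100 on Moz's 0-100 scales.
    return min(100, base_priority * importance_multiplier(importance))

print(round(adjusted_priority(60, 1)))   # deprioritized to ~20
print(adjusted_priority(60, 10))         # boosted, capped at 100.0
```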
Look forward to your feedback, and thanks for checking these out!
-
What is a Good Keyword Volume Score?
Hi All!
Continuing my series of discussions about the various keyword scores we use here at Moz (previously: Keyword Difficulty & Keyword Opportunity)... Let's move on to Volume.
Volume in Moz's tools is expressed as a range, e.g. Bartending Certification has a volume of 201-500. These ranges correspond to data we have suggesting that, in an average month, that keyword is searched for a minimum of X to a maximum of Y times (where X-Y is the volume range). We build our volume models from clickstream data, data from Google AdWords, and some PPC AdWords campaigns we run or have access to. As such, we've got very high confidence in these numbers -- 95%+ of the time, a given keyword's monthly search volume on Google will fall inside that range.
If you want to see all the nitty gritty details, check out Russ Jones' post on Moz's Keyword Volume and how we calculate it.
As far as a "good" volume score -- higher is usually better, as it means more demand, but lots of keywords with low volume scores can also add up to strong traffic when combined, and they may be more relevant. Capturing exactly the audience you want that also wants you is what SEO is all about.
p.s. When Keyword Explorer or Moz Pro gives you a "no data" or "unknown" volume number, it may just mean we haven't collected information from our clickstream providers or AdWords crawls, not that the keyword has no volume (though it sometimes means that, too, we just don't know yet). One way to verify - see if Google Suggest autofills it in when you type in the search box. If it does, that's usually a sign there's at least some volume (even if it's only a few searches a month).
-
What is a Good Keyword Organic CTR Score?
Hi Folks! You might have seen my discussion on What Is a Good Keyword Difficulty Score, and this continues in the same vein. Keyword Organic CTR is probably my favorite score we developed in Keyword Explorer and Moz Pro. It looks at the SERP features that appear in a set of results (e.g. an image block, AdWords ads, a featured snippet, or knowledge graph) and then calculates, using click-through rates we modeled from our partnership with Jumpshot's clickstream data, what percent of searchers are likely to click on the organic web results.
For example, in a search query like Nuoc Cham Ingredients, you've got a featured snippet and then a "People Also Ask" feature above the web results, and thus, Keyword Explorer is giving me an Organic CTR Score of 64. This translates directly to an estimated 64% click-through rate to the web results.
Compare that to a search query like Fabric Printed Off Grain, where there's a single SERP feature - just the "People Also Ask" box, and it's between the 6th and 7th result. In this case, Keyword Explorer shows an Organic CTR Score of 94, because we estimate that those PAAs are only taking 6% of the available clicks.
There are two smart ways you should be using Organic CTR Score:
- As a way to modify the estimated volume and estimated value of ranking in the web results for a given keyword term/phrase (KW Explorer does this for you if you use the "Lists" and sort based on Potential, which factors in all the other scores, including volume, difficulty, and organic CTR)
- As a way to identify SEO opportunities outside the normal, organic web results in other SERP features (e.g. in the Nuoc Cham Ingredients SERPs, there's serious opportunity to take over that featured snippet and get some great traffic)
OK, so all that said, what's actually a "good" Organic CTR score? Well... If you're doing classic, 10-blue-links style SEO only, 100 is what you want. But, if you're optimizing for SERP features, and you appear in a featured snippet or the image block or top stories or any of those others, you'd probably be very happy to find that CTR was going to those non-web-results sections, and scores in the 40s or 50s would be great (so long as you appear in the right features).
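As a rough illustration (the per-feature click shares below are made-up placeholders, not Moz's actual model, which also accounts for feature position), the score behaves like 100 minus the share of clicks the SERP features absorb:

```python
# Assumed click shares per SERP feature - placeholder numbers only.
FEATURE_CLICK_SHARE = {
    "featured_snippet": 20,
    "people_also_ask": 6,
    "image_block": 10,
}

def organic_ctr_score(features):
    """Estimate the percent of clicks left for the organic web results."""
    taken = sum(FEATURE_CLICK_SHARE.get(f, 0) for f in features)
    return max(0, 100 - taken)

# A SERP with only a People Also Ask box, like the "Fabric Printed Off
# Grain" example above, leaves most clicks for the web results:
print(organic_ctr_score(["people_also_ask"]))  # 94
```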
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Either can work well - subfolders are a fine solution for many. There's slightly more advantage in country targeting with TLD extensions, but you lose a lot in link equity and ranking ability, so it's a tradeoff you'll have to weigh.
-
RE: Is everybody seeing DA/PA-drops after last MOZ-api update?
Hi Niels - yep, I saw a bit of this too. I believe there are two causes:
1. We crawled a larger swath of the web in this index, so we captured more sites and more links. That may mean the scaling of PA/DA (which are logarithmic) stretches to accommodate the larger number of links found, especially to sites at the top of the scale. For example, if Facebook has a DA of 100 with 5 billion links and we then find 5 billion more links to it, Facebook still has a DA of 100, but the threshold for that score is much higher. Thus, sites with fewer (and lower-quality) links will fall in DA as the scale stretches.
2. We crawled some weird stuff in this index by mistake (or rather, because spammers built some nasty, deep crawl holes that Google probably didn't fall for, but we did). A ton of odd domains on strange ccTLDs were seen and crawled because they gamed PageRank with lots of sketchy links. We've now excluded these from future indices, and hopefully we'll see the impact abate.
All that said, over time, as our index grows, you can expect that raw DA/PA numbers could get harder to achieve, meaning a lot of sites will drop in PA/DA (and some will grow too, as we discover more links to them in the broader web). My best advice is always to not use PA/DA as absolutes, but rather relative scores. That's how they're designed and how they work best.
It's like back when Google had PageRank, and Moz.com grew from PR4 to PR7, then as Google got bigger and bigger, and the web got bigger, Moz.com fell to PR5, even though we had way more links and ranked for way more stuff. The raw PR scale had just become stretched, so our PageRank fell, even though we'd been improving.
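DA isn't computed this simply, but a toy logarithmic scale normalized to the most-linked site (all numbers below are invented) shows why everyone else's score can slip when the top of the index grows:

```python
import math

# Toy score: position a site's link count on a log scale that tops out
# at the most-linked site in the index. Not Moz's real DA model.
def toy_da(links, top_links):
    return round(100 * math.log10(links + 1) / math.log10(top_links + 1))

# The same 10,000-link site scores lower once the index's top site
# doubles its links, even though nothing about the site changed.
print(toy_da(10_000, 5_000_000_000))   # 41
print(toy_da(10_000, 10_000_000_000))  # 40
```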
-
RE: What to believe Google's Comp or Moz's Opportunity/Potential graph
Hi Papa -- Google's number is for paid search. They're just telling you how much competition there is for the paid/AdWords results. Moz's KW Explorer tells you difficulty for the organic (non-paid) results. They're two entirely different things! More here: https://moz.com/help/guides/keyword-explorer
-
RE: Keywordplanner-esque tool for large amounts of keywords (>1000)
Given the needs, I suspect you'll want to talk to one of the clickstream data providers about a custom deal. Jumpshot and SimilarWeb are the two I'd look at. Just be prepared to pay a lot, as their data is proprietary and they can thus charge quite a bit.
-
RE: Bad Dates in SERPs, YouTube & Rankings (Nov. 10-18)
Thanks for posting this Pete. I've been hearing from folks about these issues, too, and obviously a lot of questions and speculation is flying around Twitter.
I wonder if Google thinks the rankings are unconnected to the date issues, but searcher behavior or some other indirect attribute is having this common (though not universal) observable impact on sites employing the embeds/dates... Wouldn't be the first time (e.g. subdomains, rich snippets, CTR, etc. supposedly having no "direct impact" but then showing obvious, observable/testable effects).
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Hi Shop-Sq,
A) I think merging the domains is almost definitely the right move, so long as you do it right (get the redirects nailed, don't have any performance issues, update the links correctly, etc).
B) I don't believe there's any penalty coming for shopping sites that also happen to host forum content or blog content. The only risk is if a lot of the site becomes (or is) cruft, meaning low-engagement/low-value content Google doesn't want in its index or searchers never click on/stay on.
C) We have not seen what your developers describe. Some forums haven't done well, others have done quite well (e.g. Moz's Q+A has benefited a lot over the last few years). I don't believe any hiccups you encounter will be because the types of content are unique, but rather because of technical issues, missteps, or content that doesn't help searchers and doesn't perform in Google (and all three should be avoidable).
D) Yes. We/I still recommend subfolders over subdomains. More strongly than ever actually. We've got some new evidence that Google judges content on a subfolder level, hence subdomains may not inherit all the rankings abilities of subfolders/other subdomains on the same domain.
-
RE: Best and easiest Google Depersonalization method
I'm surprised at how well this still works, but it does:
- Use an incognito browser window to remove account personalization
- Use a query string like this: https://google.co.nz/search?q=your+keyword+terms&gl=us
With the second step above, you're removing the geographic bias of any particular region/IP address by searching on Google New Zealand, then re-geolocating the search to the US. This will give you non-geo-biased results.
If you want to see how specific results look from a particular region, there are two semi-decent options:
A) Use Google's Ad Preview Tool: https://adwords.google.com/apt/anon/AdPreview?__u=1000000000&__c=1000000000
B) Use the &near parameter, e.g. https://google.co.nz/search?q=your+keyword+terms&gl=us&near=seattle+wa
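If you build these URLs often, a tiny Python helper (the function name is mine; the gl/near parameters are the ones from the post) keeps the query strings straight:

```python
from urllib.parse import urlencode

def depersonalized_url(query, gl="us", near=None):
    """Build a geo-neutral Google NZ query, re-targeted via gl= (and
    optionally biased to a city via near=)."""
    params = {"q": query, "gl": gl}
    if near:
        params["near"] = near
    return "https://google.co.nz/search?" + urlencode(params)

print(depersonalized_url("best coffee roasters"))
print(depersonalized_url("best coffee roasters", near="seattle wa"))
```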