Questions
301 vs 410 for a subdirectory that was moved to a new domain, 2 years later
Google is adding and removing URLs from its index fairly slowly right now, and it's not uncommon for changes to take several weeks to filter through to the index, especially for site: searches. This is very annoying (even more so for people trying to launch brand-new sites), but not a huge deal since, to Laura's point, these URLs are most likely not showing up for any searches; they just haven't filtered out of the index yet. I would give it another week or two and see what happens. You may also want to do a Fetch and Submit in Search Console for a few of the subdirectory URLs, to make sure that Google revisits them and registers that they are 410s now - if they've been redirecting for 2 years, Google may just not be crawling them that frequently.
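To verify what Google will see when it recrawls, it can help to spot-check the old URLs' status codes yourself. Here's a minimal Python sketch (the example.com URLs are placeholders); it suppresses redirect-following so a lingering 301 is reported as a 301 rather than silently followed:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects; surface the 301/302 itself

def check_status(url):
    """Return the HTTP status code for a URL without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code  # 3xx (suppressed above), 4xx and 5xx all land here

def audit(urls, fetch=check_status):
    """Map each URL to its status code using the supplied fetcher."""
    return {u: fetch(u) for u in urls}

# e.g. audit(["https://example.com/old-dir/page1",
#             "https://example.com/old-dir/page2"])
```

You want every old subdirectory URL to come back 410, not 301.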
Intermediate & Advanced SEO | RuthBurrReedy -
Resubmitting disavow file after penalty removal
I've been through the disavow process on my own site. While missing out on a few possible links sucks, ask yourself: are they high enough quality to warrant the possible negative consequence of lost revenue if the penalty comes back? Personally speaking, unless it's a link from the Wall Street Journal or CNN, I wouldn't be poking the sleeping bear; it's just not worth it.
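For anyone resubmitting, the disavow file itself is just a plain-text list; a sketch of the format is below (the domains are made-up examples). You can use "#" comments to document your outreach history, which reviewers reportedly appreciate:

```text
# Links from low-quality article directories; owners contacted twice, no reply
domain:spammy-directory.example
domain:article-farm.example
# Individual URLs where disavowing the whole domain would be too broad
http://blog.example.net/cheap-links-page.html
```

When you resubmit, upload the complete file, since each new upload replaces the previous one rather than adding to it.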
White Hat / Black Hat SEO | MarkAse -
Is Moz Domain Authority still relevant when it comes to Google ranking?
In our last Ranking Factors correlation study, Domain Authority had a 0.29 correlation, which is one of the stronger correlations we measure. The strongest metric, Page Authority, showed a 0.39 correlation. The idea of Domain Authority is that, comparing two websites with all other things being equal, the one with the higher score is more likely to rank. A significant portion of the weighting in Domain Authority comes from links, so as Google shifts the significance of links, this changes. But as you've observed, in the real world things are rarely equal. Moz is also working on spam scoring to help us better discount links that Google is likely ignoring. With all the talk about Google devaluing links, what's surprising is how important links still are. Metrics like DA and PA haven't changed much in their ability to gauge ranking potential. Perhaps someday we'll need new signals; for now, they are still reliably useful.
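For context, the Ranking Factors figures above are Spearman rank correlations. As a rough sketch of what that 0.29 measures, here is a stdlib-only Python implementation; any DA/ranking-position numbers you feed it would be your own data, not Moz's:

```python
def rank(values):
    """Average 1-based ranks, with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over the tie group
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman rank correlation between two equal-length sequences."""
    rx, ry = rank(xs), rank(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A value of 0.29 means higher-DA pages tend to sit higher in the results, but far from deterministically, which is exactly the "all other things being equal" caveat above.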
Search Engine Trends | Cyrus-Shepard -
Using Canonical on home page
Hi Casey, I have a "self-referring" canonical (pointing to itself) on all of my pages where another isn't needed. There's certainly no harm in doing so, and I believe there is a benefit. If your content goes up, is indexed first by Google, and has the canonical tag on it, that is a really clear directive to Google/Bing that you are the originator of the content. That means if someone comes along and steals/scrapes the content from your site, they are far less likely (in my opinion) to succeed in passing it off as their own (and therefore leaving you with a duplicate content issue). In addition, having self-referring canonicals on pages future-proofs you against any plugins/searches/features of your site that might generate multiple URLs from an original (think www.domain.com?searchquery.html if you have a search function on your site). Having a canonical means Google will not index multiple versions of your URL, which might also result in duplicate content. It also protects against people who try to brute-force multiple versions of your URL into the index (i.e. sending spam links to www.domain.com?randomquery - it's very rare, but I've seen it happen). In short, I think self-referring canonicals are a great idea - use them wherever you can.
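As a sketch of that second point, here is a hypothetical Python helper that strips query strings and fragments so every crawlable variant of a URL emits the same self-referring canonical. Treat it as illustrative only - real sites often have query parameters (pagination, variants) that should survive normalization:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Drop the query string and fragment so crawl variants collapse
    to one canonical URL."""
    parts = urlsplit(url)
    path = parts.path or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

def canonical_tag(url):
    """Render the self-referring canonical element for a page's own URL."""
    return '<link rel="canonical" href="%s" />' % canonical_url(url)
```

With this in the template head, www.domain.com?randomquery and www.domain.com both declare www.domain.com as the canonical, so the spam-variant attack described above has nothing to index.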
Intermediate & Advanced SEO | TomRayner -
Mobile and SEO
Hi Casey, It sounds like you're using dynamic serving, rather than pure responsive design. The difference is that, while both approaches keep a single URL for mobile and desktop users, a responsive approach also keeps the same HTML and simply uses CSS to rearrange content on the page to fit a smaller screen; dynamic serving allows you to actually serve different HTML based on user agent. The main thing to do for dynamic serving is to make sure you send a Vary: User-Agent HTTP header, to indicate to Google that you are not cloaking but rather are serving different content based on user agent to provide a mobile-friendly experience. You can find more info on this here: https://developers.google.com/webmasters/smartphone-sites/details
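A minimal sketch of the dynamic-serving response logic in Python follows. The user-agent tokens and templates are placeholder assumptions (production UA detection is far more involved); the point is that the Vary header goes out on every response, mobile or not:

```python
MOBILE_TOKENS = ("iphone", "android", "mobile")  # deliberately simplistic

def build_response(user_agent):
    """Pick an HTML template by user agent; always emit Vary: User-Agent."""
    is_mobile = any(t in user_agent.lower() for t in MOBILE_TOKENS)
    body = "<html>mobile</html>" if is_mobile else "<html>desktop</html>"
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        # Tells Google (and intermediate caches) that the HTML differs by
        # user agent, so dynamic serving isn't mistaken for cloaking.
        "Vary": "User-Agent",
    }
    return body, headers
```

Googlebot's smartphone crawler then knows to fetch the page with a mobile user agent as well as a desktop one.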
Web Design | bridget.randolph -
Google Analytics Page Value - Help please
Casey, here is the exact quote from the Google Analytics blog on the Page Value topic: "Now there are a couple of things to be aware of. The calculation does not include all transaction and goal revenue for the entire visit. It's only the goal conversions and transactions that happen after the page is viewed, not before the page is viewed." Source: http://analytics.blogspot.ro/2012/07/understanding-and-using-page-value.html So in your case, the 2345 Unique Page Views also include page views that happened after the goal was completed, and these are not taken into consideration when calculating the Page Value. Let me give you an example: someone lands on an internal page of your website, completes the goal without visiting your homepage, and is then redirected to the homepage. That visit to the homepage is counted in the "Unique Page Views" metric, but is not counted in the unique page views used when calculating the Page Value.
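A toy Python model may make this concrete. Sessions here are hypothetical lists of page names and ("goal", value) events; following the quote above, only value occurring after a view of the page counts, and a view that happens only after all the value events is excluded from the denominator:

```python
def page_value(sessions, page):
    """Toy GA Page Value: value generated after a view of `page`,
    divided by the unique pageviews of `page` that preceded value."""
    total_value = 0.0
    unique_pvs = 0
    for events in sessions:
        view_idxs = [i for i, ev in enumerate(events) if ev == page]
        value_evs = [(i, ev[1]) for i, ev in enumerate(events)
                     if isinstance(ev, tuple)]
        if not view_idxs:
            continue
        first_view = view_idxs[0]
        # only goal/transaction value AFTER the page view counts
        total_value += sum(v for i, v in value_evs if i > first_view)
        # a view occurring only after all value events is excluded
        last_value = value_evs[-1][0] if value_evs else None
        if last_value is None or any(i < last_value for i in view_idxs):
            unique_pvs += 1
    return total_value / unique_pvs if unique_pvs else 0.0
```

In the redirect-to-homepage example from the answer, the post-goal homepage view adds to the reported Unique Page Views number but contributes nothing to either side of this division.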
Online Marketing Tools | SorinaDascalu -
Linking and non-linking root domains
Casey - Let me take a quick stab at your question. I think you are asking what effect a nofollowed link vs. a followed link has on SEO rankings for a site? If so, the answer is that, generally, nofollow links coming into your site don't help your SEO efforts. That said, the real answer is a little murky, and I'll try to explain it below: According to Moz's Open Site Explorer, 2.97% of all links they found were nofollowed, out of 106 billion URLs and 150 million root domains. http://www.opensiteexplorer.org All of the links from Moz.com's QA section are nofollowed, as an FYI. Wikipedia's external links are nofollowed as well, which was done as a means to reduce abuse of the system and prevent people from using Wikipedia as one giant inbound link source. According to the nofollow Wikipedia entry: http://en.wikipedia.org/wiki/Nofollow nofollow links were originally suggested to stop comment spam in blogs, and in early 2005 Google's Matt Cutts and Blogger's Jason Shellen proposed the nofollow value to address the problem. Generally speaking, nofollowed links don't help your site from an SEO perspective. That said, Google has left it a bit open with their answer on how they handle nofollowed links: https://support.google.com/webmasters/answer/96569?hl=en "How does Google handle nofollowed links? In general, we don't follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it's important to note that other search engines may handle nofollow in slightly different ways." Here's what Matt Cutts of Google says about nofollow links: http://www.mattcutts.com/blog/pagerank-sculpting/ "Nofollow links definitely don't pass PageRank. Over the years, I've seen a few corner cases where a nofollow link did pass anchortext, normally due to bugs in indexing that we then fixed. The essential thing you need to know is that nofollow links don't help sites rank higher in Google's search results." It's possible that nofollowed links can actually hurt you if you have too many of them, or if you're trying to use followed vs. nofollowed links within your site's internal link structure to "PageRank sculpt" certain pages. According to Wikipedia: on June 15, 2009, Matt Cutts, a well-known software engineer at Google, announced on his blog that Googlebot would no longer treat nofollowed links in the same way, in order to prevent webmasters from using nofollow for PageRank sculpting. As a result of this change, using nofollow leads to evaporation of the PageRank of outgoing links: the new system divides PageRank by the total number of outgoing links, regardless of nofollow or follow, but passes PageRank only through the followed links. Matt Cutts explains that if a page has 5 normal links and 5 nofollowed outgoing links, the PageRank is divided by 10 and one share is passed by each of the 5 normal links. http://www.mattcutts.com/blog/pagerank-sculpting/ Back to Wikipedia, though: Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page unless it was already in Google's index for other reasons (such as other, non-nofollow links pointing to the page). I hope this helps! -- Jeff
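Matt Cutts' 5-followed/5-nofollowed example reduces to simple arithmetic, sketched here in Python:

```python
def pagerank_per_follow_link(page_rank, follow_links, nofollow_links):
    """Post-2009 model per Matt Cutts' example: PageRank is divided by
    ALL outgoing links, but only followed links receive a share."""
    total = follow_links + nofollow_links
    if total == 0 or follow_links == 0:
        return 0.0
    return page_rank / total

def total_pagerank_passed(page_rank, follow_links, nofollow_links):
    """Sum of the shares actually passed; the nofollow shares evaporate."""
    per_link = pagerank_per_follow_link(page_rank, follow_links,
                                        nofollow_links)
    return per_link * follow_links
```

So a page with PageRank 10, 5 followed links, and 5 nofollowed links passes 1 unit per followed link and 5 in total; the other half simply evaporates, which is why sculpting with nofollow stopped working.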
Intermediate & Advanced SEO | customerparadigm.com -
Google Manual Penalty - Unnatural Links
You'll get way more than 1000 links from WMT. It says you get links from the top 1000 domains, but it's usually quite a bit more; I have had spreadsheets from WMT that contain over 30,000 links. John Mueller (Google employee) once said that WMT was all you needed in order to remove a penalty, so for over a year that's all I used. However, in the last couple of months, since Google started giving examples of unnatural links when a site fails reconsideration, many of those examples are not in WMT! I think what John meant was that you could see the patterns in WMT, such as articles, directories, etc., and then use those patterns to find additional links. But if you don't have a list of all of the links you have made, that is not going to work for most people. So now what I do is combine the links from WMT, Ahrefs, Majestic SEO and Open Site Explorer. It's a pain. I've written a script that combines them for me, but it still takes me quite a while to put them together. With that being said, I have removed penalties from some sites by just using the WMT links. And then I've had others where Google has been really nitpicky and has not removed the penalty until we addressed every single self-made link.
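The combining step itself is straightforward; here is a hypothetical sketch of deduplicating URL lists exported from the different tools (the real work, as the answer says, is normalizing each tool's export format first):

```python
def merge_link_lists(*link_lists):
    """Combine link URLs from multiple tools' exports, deduplicating
    case-insensitively and ignoring trailing slashes."""
    seen = {}
    for links in link_lists:
        for url in links:
            key = url.strip().lower().rstrip("/")
            seen.setdefault(key, url.strip())  # keep first spelling seen
    return sorted(seen.values())
```

Feed it one list per tool (WMT, Ahrefs, Majestic, OSE) and you get a single sorted master list to work through.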
Intermediate & Advanced SEO | MarieHaynes -
Moz Rank and how to do better?
The existing links could themselves lose MozRank, which would mean a lower MozRank score for the site they link to. Most sites will see their MozRank decrease over time in the absence of new links being built.
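Moz does not publish a decay formula, but a toy model can illustrate the dynamic: without new links a score drifts down, while steady link acquisition offsets the decay. Every parameter below is made up purely for illustration:

```python
def projected_mozrank(current, monthly_decay=0.02, months=12,
                      new_link_gain=0.0):
    """Toy model only - NOT Moz's actual formula. Each month the score
    loses a fixed fraction and gains a flat amount from new links."""
    score = current
    for _ in range(months):
        score = score * (1 - monthly_decay) + new_link_gain
    return round(score, 2)
```

The takeaway matches the answer above: standing still on link building usually means slowly losing ground.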
Intermediate & Advanced SEO | TakeshiYoung