Questions
How to treat low-value and automated links during the link pruning process?
Hi there. I often find it is easier to take value out of the equation when looking at removing/disavowing links, and to focus instead on how the link got there. In your first example, the link may not be doing any good or harm and it might look out of place, but there's no doubt it is a "natural" recommendation, in the sense that it hasn't been deliberately built by the company it links to. Such links wouldn't be a problem at all, in my opinion. In your second example, again, while it looks like an odd link, it is clear that a webmaster has not gone out and built it deliberately. That is ultimately what Google is looking for.

Automatic crawlers, particularly stat logs and others that will be well known to Google, won't need removing or disavowing - I'm fairly sure the algorithm (plus manual reviewers) simply discounts them. If a link looks deliberately placed, or produced by someone with a vested interest, automated or otherwise, that's where the alarm bells start ringing for quality checks. I believe it's why the idea of "link earning" - getting people to link to you purely on the strength of the content and assets you produce - has proven so popular recently. Google will reward those kinds of links all day. Your first example is an instance of that, albeit a small one. Links like your second example will simply be discounted, not actively penalised.

It's interesting to consider that once you see links as being "earned" or "natural", such as the client recommendation in your first example, you can understand how some SEOs have manipulated the algorithm quite successfully: they replicate these "natural"-looking links. Where most slip up is that they leave a footprint or visible clues that they're doing it deliberately. I don't advocate you do that at all - earn those links by being an awesome company - but it's always good to know how things can work and be exploited, as you can learn to avoid pitfalls that way.
Hope this helps!
Intermediate & Advanced SEO | TomRayner
Large-Scale Penguin Cleanup - How to prioritize?
Your data sources are solid (Ahrefs, OSE & Majestic), but I recommend including Bing as well. The data is free and you will find at least some links not shown in the other sources.

The link prioritization you shared, however, is incorrect.

"Priority 1: Clean up site-wide links with money-words; if possible keep a single-page link" - While it is true that site-wide links are commonly manipulative, removing the site-wide link and keeping a single one does not necessarily make it less manipulative. You have only removed one of the elements often used to identify manipulative links.

"Priority 2: Clean up or rename all money keyword links for money keywords in the top 10 anchor link name distribution" - A manipulative link is still manipulative regardless of the anchor text used. Back in April 2012, Google used anchor text as a means to identify manipulative links. That was over 18 months ago, and Google's link identification process has evolved substantially since then.

"Priority 3: Clean up no-brand sitewide links; if possible keep a single-page link" - Same response as #1 and #2.

"Priority 4: Clean up low-quality links (other niche or no link juice)" - See below.

"Priority 5: Clean up multiple links from same IP C class" - The IP address should not be given any consideration whatsoever. You are using a concept that had validity years ago and is completely outdated. For example, bonegear.net resolves to 66.7.211.83 and vitopian.com resolves to 64.37.49.163. There are no commonalities between those two IP addresses, C block or otherwise, yet both sites are hosted on the same server.

You have identified the issue affecting your site (Step 1) and collected a solid list of your backlinks using multiple sources (Step 2). The backlink report is an excellent step which places you well above most site owners and SEOs in your situation.

Step 3 - Identify the manipulative links from every linking domain.

a. Have an experienced, knowledgeable human visit each and every linking domain.
Yes, that is a lot of work, but it is what's necessary if you are going to accurately identify all of the manipulative links. Prior to beginning this step, be absolutely sure the person can identify manipulative links with AT LEAST 95% accuracy, although 100% is strongly desired.

b. Document the effort. I have had 3 clients who approached me with a Penguin issue; we confirmed there was no manual action in place at the time we began the clean-up process, but before we finished, the sites incurred a manual penalty. Solid documentation of the clean-up effort is required by Google in case the Penguin issue morphs into a manual penalty. Also, it just makes sense. You mentioned 50+ web properties, so clearly others will be performing these tasks.

c. Audit the effort. A wise former boss once stated, "You must inspect what you expect". Unless you carefully audit the work, the process will fail. Evaluators will mis-identify links; you will lose some quality links, and manipulative links will be missed as well.

d. While you are on each site, capture the manipulative site's e-mail address and contact form URL (if any). This information is helpful when contacting site owners to request link removal.

Step 4 - Conduct a Webmaster Outreach Campaign. Each manipulative domain needs to be contacted in a comprehensive manner. In my experience, most SEOs and site owners do not put in the required level of effort.

a. Send a professional request to the site's WHOIS e-mail address.

b. After 3 business days, if no response is received, send the same letter to the e-mail address found on the website itself.

c. After another 3 business days, if no response is received, submit the request via the site's contact form. Take a screenshot of the submission (not required for Penguin, as no documentation is, but it is helpful for the process).

All of the manipulative link penalties (Penguin and manual) I have worked with have been cleaned up manually.
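The 3-business-day cadence in Step 4 is easy to get wrong by hand once dozens of domains are in play. As a minimal sketch (the function names and the cadence values are illustrative, not part of Rmoov or any tool mentioned here), the follow-up dates can be computed like this:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Return the date `days` business days after `start`, skipping weekends."""
    current = start
    added = 0
    while added < days:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday are weekdays 0-4
            added += 1
    return current

def outreach_schedule(first_contact: date) -> dict:
    """Dates for the WHOIS e-mail, on-site e-mail, and contact-form steps."""
    second = add_business_days(first_contact, 3)
    third = add_business_days(second, 3)
    return {
        "whois_email": first_contact,
        "site_email": second,
        "contact_form": third,
    }
```

Feeding each domain's first-contact date through a helper like this keeps the documentation trail (Step 3b) consistent across a large team.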
With that said, we use Rmoov to manage the Webmaster Outreach process. It sends and maintains a copy of every e-mail, and even has a place to add the contact form URL. A big time saver.

If a site owner responds and removes the link, that's great. CHECK IT! If there are only a few links, manually confirm the removal. If there are many URLs, use Screaming Frog or another tool to confirm removal. If a site owner refuses or requests money, you can often still achieve removal by having further respectful conversations. If a site owner does not respond, you can use extra measures: call the phone number listed in WHOIS, send a physical letter to the WHOIS address, or reach out on social media. Is it a .com domain with missing WHOIS information? You can report it to InterNIC. Is it a spammy wordpress.com or blogspot site? You can report that as well.

When Matt Cutts introduced the Disavow Tool, he clearly said "...at the point where you have written to as many people as you can, multiple times, you have really tried hard to get in touch and you have only been able to get a fraction of those links down and there is still a small fraction of those links left, that's where you can use our Disavow Tool". The above process satisfies that requirement; in my experience, not much less than it does. The overwhelming majority of those tackling these penalties try to perform the minimal amount of work possible, which is why forums are flooded with complaints from people repeatedly attempting, and failing, to remove manipulative link penalties.

Upon completion of the above, THEN upload a Disavow list of the links you could not remove after every reasonable human effort. In my experience you should have removed at least 20% of the linking DOMAINS (with rare exceptions). It can take up to 60 days thereafter, but if you truly cleaned up the links in a quality manner, the Penguin issues should be fully resolved.
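The "CHECK IT" step above can also be scripted when the link count is large. A rough sketch, assuming you have already fetched each linking page's HTML (the function and class names here are illustrative, not from any tool mentioned above):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect every href attribute found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def still_links_to(page_html: str, target_domain: str) -> bool:
    """True if the page still contains an <a> link pointing at target_domain."""
    finder = LinkFinder()
    finder.feed(page_html)
    return any(target_domain in href for href in finder.hrefs)
```

Run this over each supposedly cleaned page and flag any that still return True for manual review; it replaces eyeballing thousands of pages, not the human judgment about which links are manipulative.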
The top factors in determining whether you succeed or fail are:

1. Your determination to follow the above process thoroughly

2. The experience, training and focus of your team

You can resolve the issue in one round of effort and have the Penguin issue resolved within a few months... or you can be one of those site owners who thinks it is impossible and be struggling with the same issue a year later. If you are not 100% committed, RUN AWAY. By that I mean change domain names and start over. Good luck.

TL;DR - Don't try to fool Google. Anchor text and site-wide links are part of the MECHANISM used to identify manipulative links. Don't confuse the mechanism with the message. Google's clear message: EARN links, don't "build" links. Polishing up the old manipulative links is a complete waste of your time. AT BEST, you will enjoy limited success for a period of time until Google catches up. Many site owners and SEOs have already been there, and it is a painful process.
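For reference, the file uploaded to the Disavow Tool is plain text, one entry per line: `domain:` lines disavow every link from that site, while bare URLs disavow a single page, and `#` lines are comments (the domains below are placeholders, not real examples from this thread):

```text
# Contacted owner on 2013-10-01 and 2013-10-04; no response
domain:spammy-directory-example.com
# Single paid link; owner refused removal
http://blog-example.net/great-widgets-post/
```

Keeping the outreach dates in the comments mirrors the documentation habit described in Step 3b.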
Intermediate & Advanced SEO | RyanKent
Who can advise on the real-time delay until rich snippets first appear?
At the start of the year they appeared within 24 hours. Now some do not appear at all such as those associated with ratings.
Link Building | casper434
Best linking practice for international domains
I think Google is discouraging site-wide links that are included to manipulate Google SERP results and are not for users... in your case they seem completely fine. That said, adding the link to the other websites (for people who use a language other than English) on a flag image rather than with anchor text may be a better idea. A flag link in the header or footer gives your visitors a clear idea of where to go if English is not the language they are looking for, and because it carries no anchor text, it raises no red flag with Google for manipulation!
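A minimal sketch of the footer Moosa describes, with the flag images carrying the cross-domain links instead of keyword anchor text (the domains and file paths are placeholders):

```html
<!-- Footer language selector: image links, no keyword anchor text -->
<div class="language-selector">
  <a href="http://www.example.de/"><img src="/flags/de.png" alt="Deutsch"></a>
  <a href="http://www.example.fr/"><img src="/flags/fr.png" alt="Français"></a>
  <a href="http://www.example.es/"><img src="/flags/es.png" alt="Español"></a>
</div>
```

The alt text names the language for accessibility without turning the links into keyword-rich anchors.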
Intermediate & Advanced SEO | MoosaHemani
Best Practice for Inter-Linking to CCTLD brand domains
Hi Thomas, I think what you're doing right now is fine - "...linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different CCTLD domain." That seems sensible from a user perspective; the only potential downside is if you're implementing it with lots of keyword anchor text, which could be problematic. Equally, utilising JavaScript to let users select language and location seems fine to me. I hope this helps, Hannah
Intermediate & Advanced SEO | Hannah_Smith
Why is there such a big discrepancy between OSE and GWT regarding # backlinks?
The OSE index is smaller than what Google reports in GWT, but then again, the links reported in GWT are famously inaccurate and often not up to date. Google often includes "junk" links that have very little chance of affecting its ranking algorithm. Linkscape crawls approximately the top 25% of the web, which is where the majority of the links that actually influence rankings are found. It's not a perfect system, and sometimes links are missed, but it works well as a predictive tool and for competitive research.

"It is interesting that the links that do show up in OSE are nearly exclusively sites that we own."

Yes, it is interesting. The links Linkscape misses tend to be either beneath layers of navigation or on pages with few inbound links. Sometimes improving your link structure will help your links both appear in Linkscape and improve crawling and rankings in Google (although I'd be careful in your case, since you own the sites in question, not to create the appearance of a link scheme). Best of luck!
Technical SEO Issues | Cyrus-Shepard
What is the best SEO URL design for keywords with a period?
As mentioned, the keyword is "1.FC Nuremberg", which is naturally what users are typing into Google.
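One practical approach is to normalise the period during slug generation rather than leave it in the URL path. A quick sketch (the replacement rules here are one option among several, not a recommendation from this thread):

```python
import re

def slugify(keyword: str) -> str:
    """Lowercase, turn periods and spaces into hyphens, drop other punctuation."""
    slug = keyword.lower()
    slug = slug.replace(".", "-")               # "1.fc" -> "1-fc"
    slug = re.sub(r"[^a-z0-9-]+", "-", slug)    # spaces and other chars -> hyphen
    slug = re.sub(r"-{2,}", "-", slug).strip("-")  # collapse runs, trim edges
    return slug
```

This keeps the "1" and "FC" tokens (which users type) while avoiding a literal period in the URL, which some servers and tools mishandle.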
Technical SEO Issues | tomypro
SEO & Social | More SEO effect by liking the page URL rather than the Facebook Page?
Hi Thomas, I think this is a question that comes up often. One way to handle it is to have both types on your page, as we do on our blog: you'll notice the SEOmoz page "like" in the right nav bar, and the post "like" at the bottom of the post. If you're trying to figure out the best thing to do purely for SEO purposes, I'd say you want people to like the actual URL of the content. I say that because Google now shows social sharing in the SERPs: if, for example, you're logged in and do a search, and a connected friend on FB has shared content on that topic, their face would show in the SERPs. Hope this helps!
Social Media | jennita
Bad neighborhood linking - can anyone share experience of how significantly it can impact rankings?
You could surely be penalized. If you really must link to them for some reason, at least use nofollow.
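Adding nofollow is a one-attribute change; a minimal example (the URL is a placeholder):

```html
<!-- Tells search engines not to pass ranking credit through this link -->
<a href="http://questionable-example.com/" rel="nofollow">their site</a>
```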
Intermediate & Advanced SEO | mickey11
Skip root page for brandname domain and just forward to key-word URL document?
I've seen it done before, for the first time about two weeks ago. I checked up on the rankings of the website, which was in a local (read: no competition) web development niche. The website was not ranking for anything. Take that as you will; it's certainly not a case study. Personally, I wouldn't do it. What value does it bring your visitors? Any at all? If not, it's probably not a good idea, per the advice Matt Cutts and the Google quality guidelines give: http://www.google.com/support/webmasters/bin/answer.py?answer=35769 Take a step back and think about it. Doesn't it seem excessively spammy?
Intermediate & Advanced SEO | deltasystems
Skip root page for brandname domain and just forward to key-word URL document?
Q&A is a little slow on the weekends, and really slow this Labor Day weekend, so it's taking a little longer to get responses than it might at other times. Hang in there, and I'm sure you'll get some comments. If not, ask the question again in a couple of weeks, in the middle of the week (and leave me a note on here and I'll delete this copy of the question).
Intermediate & Advanced SEO | KeriMorgret
Best strategy for moving a country subdirectory to a dedicated ccTLD w/o losing organic search volume?
Thanks for those references Ryan, I appreciate it. I will have a look tonight. /Thomas
Intermediate & Advanced SEO | tomypro
Multi-word keyphrase in domain name: with or without dashes?
There is evidence that Google treats hyphens in the root domain as a spam signal. A lot of lead generation publishers, unable to afford domains like www.freecreditreport.com, resort to buying spammy domains like www.free-credit-report.com. A lot of porn sites utilize this tactic as well. I would definitely steer clear of it; most sites I see using hyphenated domains also have manipulated link graphs, paid links, etc. Hope this helps. Be well!
Technical SEO Issues | AnthonyYoung