To back up what Kyle says, yes there has to be relevance. The primary driver should always be whether it adds value and offers something the visitor is looking for, rather than trying to shoehorn spuriously connected content onto a site. If it's not related in a positive way it will look manufactured, and will probably be on a lower-quality site. As soon as that happens you move across to join the penalty crew and take your chances.
Posts made by MickEdwards
-
RE: Guest Posting- Only Relevant Websites or Not?
-
RE: Something unintelligible in google search results.
What are you comparing it to - if not Etsy, what do you believe should be in first position?
Etsy has very high domain authority, a large link profile and a strong social footprint. The keyword is in the URL, the meta title and the content, so for me it is a perfect candidate for grabbing pole position. It also has a good volume of reviews.
-
RE: MOZ.com
Don't get disheartened. It can take a long time for Google to crawl the URLs your links sit on. When you used the disavow tool, did you list each URL or the whole domain? You are probably best disavowing at the domain level so you capture all links from that site.
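For reference, the disavow file is a plain text upload with one entry per line; a minimal sketch (the domain names here are placeholders):

```
# comment lines start with a hash
# domain: entries catch every link from that site
domain:spammy-links-example.com
domain:another-bad-example.net
# a bare URL only disavows links from that single page
http://example.org/some-page.html
```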
-
RE: MOZ.com
You cannot do this. If you have a poor-quality domain or homepage, redirecting just injects the problem into the new location. You need to identify the problem links and deal with them by disavowing - http://moz.com/community/q/link-disavow
-
RE: How Long Until Manually Removed Anchor Text Links Stop Showing Up?
From a Google point of view, crawling the pages you have/had links on can take hours, days, weeks or months depending on the authority of the domain and how accessible the URL is, so I imagine the Moz crawler, for example, will take a similar time.
-
RE: Could someone be sabataging my site by bad backlinks?
If you are getting links like this I would disavow the domains responsible.
-
RE: Migrating EMD to brand name domain. Risk of Penguin Penalty?
It probably depends on the anchors. If the EMD was bluewidgets.com, then anchors like 'http://www.bluewidgets.com', 'bluewidgets.com' or 'bluewidgets' should be OK. If there was a high level of 'blue widgets' anchor text I would start to be a little concerned.
-
RE: How to rewrite/redirect a folder name with .htaccess
You may need to escape the white-space explicitly with \s:
RewriteEngine On
RewriteRule ^old\sfolder/(.*)$ /newfolder/$1 [R=301,NC,L]
-
RE: Robots.txt: how to exclude sub-directories correctly?
I've always stuck to Disallow and followed -
"This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:"
http://www.robotstxt.org/robotstxt.html
From https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt this seems contradictory:
"/*" is equivalent to "/" - the trailing wildcard is ignored.
I think this post will be very useful for you - http://moz.com/community/q/allow-or-disallow-first-in-robots-txt
-
RE: Robots.txt: how to exclude sub-directories correctly?
As long as you don't have directories somewhere in /* that you want indexed, then I think that will work. There is no 'allow' involved, so you don't need the first line - just:
disallow: /directory/*
You can test out here- https://support.google.com/webmasters/answer/156449?rd=1
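As a sketch, a complete file for that case might look like this (since robots.txt rules are prefix matches, the trailing wildcard is optional):

```
User-agent: *
Disallow: /directory/
```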
-
RE: Can I make a universal 301 redirect rule for my site?
If you have mod_rewrite enabled on your server you could try something like this in your .htaccess file:
RewriteEngine On
RewriteRule ^(.*)\.htm$ /$1.php [R=301,L]
-
RE: Why does it take so long to generate CSV reports from OSE?
I can answer that question from a slightly different perspective. I worked with software that parsed varying file types, and CSV files were slow to process because of their size; writing to them was the same. So I'm not at all surprised at the slowness of exporting your data to them - not really Moz's fault.
-
RE: Buying keyword urls for local traffic
This should go a long way to answering your question - http://moz.com/community/q/what-are-the-good-strategies-using-satellite-sites-in-seo
-
RE: Threatening SEO practice
I would monitor the links in OSE. At any whiff of dubious links, disavow the domain; if more appear, disavow the new domain, and so on. They will soon get tired of wasting their resources, if indeed they carry out their threat in the first place.
-
RE: Blog commenting for SEO - Useful practice?
I would suggest blog commenting has no real benefit for a link profile, as it has been hugely abused.
However, there are still very good reasons to leave quality comments on authoritative, topically related, spam-free sites:
- If no-follow, they still give your link profile a natural mix
- Opportunity to develop a relationship with the site
- Opportunity to write informative, authoritative comments and be seen as an authority
- Slight possibility of referral traffic
- Still a small signal as part of a much bigger link strategy
-
RE: Exclude status codes in Screaming Frog
Took another look, and also checked the documentation online, and I don't see any way to exclude URLs from a crawl based on response codes. As I see it you would only want to exclude by name or directory, as response codes are likely to be scattered randomly throughout a site and excluding on them would impede a thorough crawl.
-
RE: Dropped ranking - new domain same IP????
Seems drastic. What caused the drop in rankings - a manual penalty, an algorithmic penalty, competition, content, a poor link profile...?
-
RE: Existing content & 301 redirects
Yes, a toxic domain will infect a new domain if a 301 is implemented.
I would lean towards cleaning up the existing domain. Even if you end up disavowing every linking domain, the existing domain is likely to have built more trust than starting from scratch. If it is a manual penalty, ensure you document all clean-up steps so you can detail them in the reconsideration request.