Hi Alan, thanks for your questions. In the interest of respecting the Moz forums, perhaps it would be better if I let the support team handle it; they are great. As much as I'd love to brag about the tool here, it just doesn't seem right. I'll see to it that support gets back to you shortly.
Posts made by HiveDigitalInc
-
RE: LinkDetox Versus Removeem??
TAGFEE: My company owns Remove'em and I helped build the bad link detection algorithm for it.
I would say that Link Detox and Remove'em are very different. Link Detox, from my estimation, focuses very much on perfecting the bad link detection mechanism. In fact, many Remove'em users upload Link Detox lists to Remove'em just to use our outreach functionality.
Remove'em intends instead to be a tool that covers "Discovery to Recovery". We don't disclose the same granular level of data on why a link is identified as concerning; instead, our focus is sorting and flagging links in a way that most efficiently allows you to do link removal outreach and build a disavow file. We get contact info for you, manage emails, track links for removal status, and even build a progress page so that you can show Google your work, so to speak, in a reconsideration request. We deliberately have a bias toward action.
That being said, from all I have heard, Link Detox is an excellent product and, as mentioned, many of our customers use Link Detox as the scalpel, so to speak, for identifying bad links, and then upload them to Remove'em to manage the link removal process.
Hope that helps!
-
RE: Link Removal Services
First, let me begin with full disclosure. I am Russ Jones, CTO of Angular (formerly Virante) and creator of Remove'em. I will do my best not to make this an advertisement in any way and answer the question more generally about link removal services.
"My question is, has anyone used them, and did you see any results?"
The answer here is a definite yes. We should probably distinguish between link removal tools and link removal services. There are some tools out there that assist with the link removal process (Remove'em, LinkRisk, LinkDetox, etc.), other tool/service blends that perform a degree of the link removal process (Remove'em Full Service, RMoov, etc.), and finally some service-only models (LinkDelete, etc.). I will say that I am personally unfamiliar with the service-only models, so I will refrain from commenting on them out of ignorance, not because they couldn't very well be useful and effective.
The blended services, by and large, appear to be effective. We have researched our competitors, and I assure you that if they were a sham, we would have made that clear in our marketing material by now.

These services tend to provide the following improvements upon DIY...
1. Existing relationships with link purveyors to accelerate link removal.
2. Outreach technology that allows them to better find contact info.
3. Existing databases of contact information tailored to this industry.
4. Knowledge of Google expectations for successful reconsideration requests.
However, you don't need any of this to run a successful recovery; the question is just a balance of your time and money. If the lost revenue from lack of rankings plus the cost of time and services (i.e., employees or an existing SEO firm running the removal campaign) is greater than the cost of these services, they are well worth your consideration.
If you have any specific questions about these services beyond their efficacy, feel free to respond here - I would be more than willing to help.
-
RE: Microsoft SEO Toolkit vs MoZ
At Virante, we really like the IIS toolkit, but it has some drawbacks. The biggest drawback is that it doesn't take into account canonical tags when looking at things like duplicate content. It also makes assumptions that are simply overkill (like lumping nearly every redirect into being an unnecessary redirect). In the end, we actually use both.
Moz gives you regular crawl data over time, which is valuable for discerning the cause of an issue because you can correlate it with changes you made to the site between crawls. The time-series data is all there in front of you.
IIS can give you some other data and, more importantly, give you the raw crawl data like the export of all internal links.
-
RE: Can 302 chains (affiliate links) from "toxic" sources hurt you? Or are you "shielded"?
We know that, over time, Google will treat a 302 redirect as permanent if it remains in place. The 302 redirect is the default, so Google often has to determine whether a person intended to place a 302 or just did it out of laziness. We recommend that you go ahead and use robots.txt to block the affiliate URL parameters so that you never have to worry about it at all.
Is the affiliate program on your site, or are you using a 3rd party? If you are using a 3rd party, they might already block Google from crawling those URLs. An easy way to check would be to follow the redirect chain and take note of the domains. Check whether any of them use either robots.txt or an X-Robots-Tag header to block Googlebot. You can also check your own links in GWT to see if they show up.
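As a rough illustration of the robots.txt approach, here is a minimal sketch. The `/go/` path and `ref` parameter are hypothetical; substitute whatever pattern your affiliate URLs actually use:

```text
# robots.txt -- block crawling of affiliate redirect URLs (hypothetical patterns)
User-agent: *
# Block a dedicated redirect directory, if you use one
Disallow: /go/
# Block any URL carrying the affiliate parameter
# (the * wildcard is honored by Googlebot, but not by every crawler)
Disallow: /*?ref=
```

Keep in mind that robots.txt only prevents crawling; a blocked URL that is linked from elsewhere can still appear in the index without a snippet.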
-
RE: Toxic Link Removal-Better to Pay an SEO Firm or Can I Do It Myself?
Hey, thanks for the positive remarks. Marie Haynes is a rockstar in her own right when it comes to link removals, though, and one of the few people I would put up against my own team.

Did want to give you a heads up though, we have some pretty big / awesome updates rolling out in Remove'em in the next 2-3 weeks so stay tuned.
-
RE: I have removed over 2000+ pages but Google still says i have 3000+ pages indexed
Just a quick question, do you see the URLs you "removed" still in the index? Or is it possible that Google has found a different set of 3000 URLs on your site?
-
RE: Content not being spidered
I see no problems. I just ran IIS Site Analysis Report and it had no real spidering issues. Everything was read correctly.
-
RE: Is Remove Em A Fantastic con?
1. We do use data from Majestic SEO, but you do have to use Open Apps to accomplish it. Majestic SEO's account is free for sites you own. If our customers are unable to get Open Apps to work, we work with them to get those links included. Did you put in a support ticket regarding this issue?
2. We do absolutely use AHrefs and SEOMoz. If you feel that links have been missed, please let us know. We can certainly see if there are any bugs.
3. Regarding recommendations of links to remove, that is a fair question. Having personally guided innumerable companies through reconsideration requests, I can tell you the last thing you want is cycle after cycle of rejected reconsideration requests because you have not removed enough links. Remove'em actually has one of the lowest thresholds among our competitors because we primarily look at anchor text. It is possible you are looking at the unfiltered list, but if that article does show up, it is likely because the anchor text used in the link has high commercial value. It is our experience that these links need to be removed, because Google will not give you the benefit of the doubt when a link looks unnatural.
Perhaps most important, you have received a full refund for the purchase price of Remove 'em. If we were a scam, we wouldn't be willing to do that.
I am sorry that you won't be adding to the more than 1,000,000 links that have been removed via our service.
Thank you again for your time and good luck in your endeavors. At this point, out of respect for the SEOMoz community, I am no longer going to respond here regarding a product that is not SEOMoz's problem.
-
RE: Is Remove Em A Fantastic con?
Hi Steve,
I am Russ Jones, an active user here at SEOMoz and my company, Virante, owns Remove 'em. I would be happy to explain to you the issue you are discussing.
When you run the tool on the home page, which makes an estimate of the number of links you have to remove, it looks at the total number of links. This means that if you bought one sitewide link and it was discovered by our predictor tool, it would count every instance of that link on the site.
However, when you run the real Remove'em tool, we find the unique linking domains, specifically because you only need to contact the webmaster once to get all the links removed. The sitewide link I mentioned above would show up only once, rather than perhaps hundreds of times as in the predictor.
We have had discussions internally about whether we should change the predictor tool to reflect unique linking domains, but unique linking domains reflects more on the number of webmasters you may need to contact and not necessarily the breadth of the penalty you face. We will certainly take this example into consideration in the future.
Thanks again for using Remove 'em!
-
RE: Does SEOmoz have a tool to find mirror sites?
Copyscape might help (http://www.copyscape.com)
-
RE: What to do?
It really depends. Sometimes duplicate content is created by what we call "URL Canonicalization Errors". This means that you can have multiple URLs that point to the same piece of content, thus creating the appearance of duplicate content. These problems can generally be solved quickly and easily with proper coding. Would you mind sharing the site? If not, you could always Private Message it to me.
We will get to the bottom of this.
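To make the canonicalization point concrete, here is a hypothetical example (example.com is a placeholder): all of these URLs might serve the exact same page, but to a crawler each one looks like a separate document:

```text
http://example.com/
http://www.example.com/
http://www.example.com/index.html
http://www.example.com/?sessionid=123
```

The usual fixes are 301-redirecting the variants to one preferred URL, or adding a canonical tag such as `<link rel="canonical" href="http://www.example.com/" />` to the page's head so Google consolidates the variants.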

-
RE: SERPs recovery? When can I believe it?
Don't be afraid. Go ahead and change them. They should be citations only (i.e., the name of your company), and there should be only one. I think it would be OK for it to be in the footer, just as you would cite a source in a real academic paper.
You may very well see some temporary rankings decreases when you do this, but it is far safer in the long run. Don't let it come back to bite you when the next Penguin rolls out.
-
RE: Twitter for SEO, benefits beyond click-throughs?
Not unless those tweets are legitimate. If you are just buying tweets / retweets from a network, it is likely that your benefit will be little to nothing. If they are real tweets you get...
1. Traffic to your site, which can in turn cause further sharing.
2. The potential that those retweeting your links might share with someone who chooses to link to you via another method.
3. Whatever potential gain there is from social media metrics.
-
RE: How should I react to my site being "attacked" by bad links?
I would proactively disavow those links and let Google know what is going on. Google needs to know that Penguin has created a market for malicious negative SEO attacks.
-
RE: Google authorship and multiple sites with multiple authors
I'm not sure personally, but you should reach out to either Mark Traphagen or AJ Kohn who are Google+ / Authorship experts.
-
RE: Links with Parameters
That is correct. However, I would consider fixing the code in some other ways too...
1. Adding a canonical tag to page.html if possible.
2. Suppressing the hpint_id=xyz parameter if possible.
Other search engines won't be able to respect your wish to ignore the ?hpint_id parameter if all you have done is update GWT.
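For the canonical-tag option in point 1, a minimal sketch might look like this (the example.com domain is hypothetical; the tag goes in the head of page.html and of every parameterized variant of it, so page.html?hpint_id=xyz points back to the clean URL):

```html
<!-- In the <head> of page.html and its ?hpint_id=... variants -->
<link rel="canonical" href="http://www.example.com/page.html" />
```

Unlike a GWT parameter setting, this hint is understood by Bing and other engines as well, which is the point made above.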
-
RE: Main Keyword penalized. And now?
I'd love to help, but I would have to know the site and keyword; it is the only way. Would you be willing to share it, either here or via private message?