Getting spam links pointing to the wrong URLs on our site, what to do?
-
Hey Mozzers,
Looking in my Google Search Console (Webmaster Tools), I'm seeing links pointing to bogus pages on my website that result in a 404. What should I do to tell Google that it has been "fixed"?
Do I just 301 it to another page?
If I add it to my disavow list, does Google remove the error in my webmaster tools?
Thank you!
-
I've seen this happen several times, and the first thing to do is really clean out your site. I'd recommend Sucuri.net for a thorough malware cleanup; their free plugin isn't enough, you need to go with the paid version. Are you on WordPress? I've seen this happen to four sites I've helped.
Then let all the errors return a 404. If you 301 them, the bad pages and links will stay alive longer. Even better is to return a 410, but a 404 should do the trick. I wrote about some of the reasons behind those in a post here on Moz a few years ago that still applies.
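If you do want to return a 410 instead of a 404, here's a minimal sketch for Apache's .htaccess using mod_alias (the path below is a made-up placeholder; swap in one of the actual bogus URLs from your crawl errors):

```apache
# Return "410 Gone" for a specific bogus URL that scrapers link to.
# /some-bogus-page is a placeholder, not a real path.
Redirect gone /some-bogus-page
```

A 410 tells crawlers the page is intentionally gone, so Google tends to drop it from the index a bit faster than a plain 404.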
Matt Cutts has a good reply on disavowing spam links in his video here, and says:
"If you're at all stressed, if you're worried, if you're not able to sleep at night because you think Google might have something, or might see it, or we might get a spam report about you, or there might be some misunderstanding or an algorithm might rank your site lower, I would feel free to just go ahead and disavow those links as well,"
I went ahead and disavowed the DOMAINS of the most recent site, just to be extra sure. Let me know if you have follow up questions!
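For reference, the disavow file you upload through Google's Disavow Links tool is just a plain text file, one entry per line. The domains below are made-up examples:

```text
# Scraper sites with bogus links to our pages (example domains)
domain:spammy-scraper-example.com
domain:another-bad-example.net
# You can also disavow a single URL instead of a whole domain:
http://bad-example.org/page-linking-to-us
```

Lines starting with # are comments, the domain: prefix disavows every link from that domain, and a bare URL disavows just that one page's links.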
-
Joe,
Thanks for the update. Our website doesn't have any malware, but Sucuri.net seems ideal for those who need it. In your article, I read:
- "Someone else from another site links to you but has a typo in their link"
In our case, these are crappy scraping sites with bogus links pointing to us. Here's an example: let's say you're nba.com and have a page on Kobe Bryant at nba.com/kobe-bryant.
The link that Google is picking up from these crappy sites points to "nba.com/kobe-bry". The link is incorrect, and I don't want to see another 404 piling up in Webmaster Tools.
What would you do?
-
Hi Shawn,
If these are from crappy scraper sites (which it looks like they are), then yes, I'd add them to your disavow list. I don't think that will get Google to remove the error in GWT, but remember: GWT reporting is separate from the actual ranking algorithm, because Google doesn't want us to see how sophisticated it is. If GWT shows "errors" in links but you've properly disavowed them, I'm fairly confident the actual Google algorithm won't penalize you.
For anyone else finding this thread: make sure that you check the sites that are linking to your broken pages before disavowing! If they're good links but out of date or mistyped, just update them.

Best,
Kristina
-
Thank you, Kristina. I went ahead and disavowed some of those links. Other bogus links are still showing in GWT, such as "blog/blog/"; we only have one /blog category, but Google weirdly keeps finding more bogus ones. When I select "Linked from", it doesn't show any sources. So my additional questions are:
1. If I add "blog/blog" to my robots.txt, can I then mark the /blog/blog links in GWT as fixed?
2. Does it matter if my robots.txt is long or short?
Thank you
-
Hey Shawn,
1. If GWT is showing a 404 without a "Linked from," I wouldn't worry about it too much. Sometimes Google tries to recrawl URLs it crawled once in the past, but that doesn't mean it's causing a problem. I would double-check that you don't have any internal or external links to "blog/blog," then just mark it as fixed. There's no point in disallowing Google from crawling a page that isn't there in the first place.
2. Nope. The only danger of a long robots.txt is that you may get lost while checking your rules and accidentally disallow a page you want in the index. Whenever you add a rule to robots.txt, go to Google Search Console and use the robots.txt Tester to check that you haven't disallowed any important pages.
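To illustrate that danger with a sketch (assuming "/blog/blog/" is the bogus path and "/blog/" is your real section): the scope of a Disallow rule is just a path prefix, so a rule that's one directory too broad blocks real pages too.

```text
User-agent: *
# Blocks only the bogus duplicate path:
Disallow: /blog/blog/
# Careful: a broader rule like "Disallow: /blog" would also
# block your real /blog section from being crawled.
```

That prefix-matching behavior is why it's worth running the robots.txt Tester after every change.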
Good luck!
Kristina