Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • Andy, I think I'd be somewhat concerned about this on two levels. First, the duplicate-content level: Google may see the content on the site-search site as the original and decide that much of your main site is unoriginal content, hurting your main site's rankings. Second, this is going to generate a huge number of links to your site from one other site, so you're running the risk of a Penguin problem. Having said all that, this should only be an issue if there are LINKS somewhere to the search results on the other site; Google itself won't execute a search form. I'd recommend Google's CSE (custom search engine) as a replacement. It puts all the search results on your own site, it's pretty inexpensive, simple to set up, and very easy to configure to look like the rest of your website. To see an example, try the search on my site http://www.visualitineraries.com.

    | MichaelC-15022
    0

  • Hello again, From what I know, if there is no manual action, then you don't have to worry about a reconsideration request. You do have to clean up the backlink profile before the next Penguin update, though. Try to contact each webmaster first. If they don't answer in a decent time span, then use the disavow tool. If there are too many links from one site, disavow their whole domain; this should simplify your work. Don't worry about nofollow links — they hurt only in some very extreme cases (watch Matt Cutts's video about nofollowed links). Good luck!
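For reference, a disavow file is just a plain-text (UTF-8) list you upload in Google's disavow tool: one full URL or one `domain:` entry per line, with `#` starting a comment. The domains below are placeholders, not real sites:

```
# Spammy directory - contacted the webmaster twice, no reply
domain:spammy-directory.example

# Single paid link we could not get removed
http://link-network.example/paid/page-12.html
```

Using `domain:` disavows every link from that site at once, which is what you want when a directory links to you site-wide.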

    | zoicaremus
    0

  • If you disavowed 80% of your links, and many of them were being counted by Google prior to the penalty, then you should expect your rankings to drop substantially even after the penalty is removed, because you simply have a fraction of the links you had before. Granted, the ones you disavowed were most likely very weak links, which is good. But moving forward, what you've got to do is build new, strong, non-spammy links. And expect this to take some time, too. From when you add a new link, it takes Google a while to discover it, then recalculate the PR that flows to your site from that link. At one point, that link-juice recalculation seemed to me to be taking about a month; I suspect it's a little faster now. As well, I'm seeing indications with several clients' sites that there's an artificial delay of 3 to 6 months when major changes happen on a site; then, with no further changes made to the site, the rankings and traffic inch up over a couple of months. So my advice: be patient; you need to rebuild the links you've lost (with better-quality sources, of course). And if you can create interesting, shareable content, leverage Facebook, G+, etc. to get your site visitors to help you build some of those links (and get real traffic from the shares) on the social sites.

    | MichaelC-15022
    0

  • Another very helpful answer - thank you.  Moving forward, I still think the best approach is one website at the end of the day.  After all, the saying is that it is better to mine one mine deep than to mine several at the same time. In this particular niche, service industries like plumbing have exact match domains with less notable content.  We're still working on the ability to offset this advantage they appear to have. Thanks again!

    | AaronHenry
    1

  • The additional massive complexity, expense, upkeep, and risk of trying to run a separate server just for bots is nowhere near worth it, in my opinion. (Don't forget, you'd also have to build a system to replicate the content between the servers every time content or code is added or edited. That replication process could well use more resources than the bots do!) I'd say you'd be much better off putting all those resources toward a more robust primary server and letting it do its job. In addition, as Lesley says, you can tune GoogleBot, and you can actually schedule Bing's crawl times in their Webmaster Tools. Though for me, I'd want the search engine bots to get in and index my site just as soon as they were willing. Lastly, it's only a few minutes' work to source a ready-made blacklist of "bad bot" user agents that you can quickly insert into your .htaccess file to completely block a significant number of the most wasteful and unnecessary bots. You will want to update such a blacklist every few months, as the worst offenders regularly change user agents to avoid just such blacklisting. Does that make sense as an alternative? Paul
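As a rough illustration of the .htaccess approach (assuming Apache with mod_rewrite enabled; the user-agent names below are made-up placeholders — substitute a maintained blacklist for real use):

```apache
# Return 403 Forbidden to requests from a few example "bad bot" user agents.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|SpamCrawler) [NC]
RewriteRule .* - [F,L]
```

The `[NC]` flag makes the match case-insensitive, and `[F]` sends the 403 without serving any page content.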

    | ThompsonPaul
    0

  • Hi Brad, my take on this would be to remove the site-wides, leave the link only on the homepages, and nofollow them. I have seen a few instances, both in the past and recently, where a web design firm left its link on the websites it built. Later, some of those websites were never developed and were left as-is with very thin content, providing no value to visitors. In cases like these, you are left with a low-quality link pointing to your site. The other case: what if the websites you built never get good content, or turn out to be shady businesses? You'd have a big problem there too. To avoid taking that chance, I would always recommend nofollowing your link on the homepages of websites that you build, unless you are very sure about the credibility of the website or its owners. But as far as site-wides are concerned: a big no-no. Hope that helps, my friend. Best, Devanur Rafi

    | Devanur-Rafi
    0

  • I am currently doing an audit of a site that has used clicksubmit for several months as its main method of link building. I am not very deep into the link profile yet, but it doesn't look good. Almost all the same anchor texts, the linking sites look as though they were made specifically to host links... so far the client site doesn't have a single link that I would consider "good". I hate having to report that. I will check in with my findings later, but I have to agree with Michael York above - Great Reviews + Dodgy Looking Product + Sounds Too Good to Be True = A client website that will be looking for penalty remediation services soon.

    | Nick_Ker
    0

  • A similar question was asked last month that might give you some insight as well. http://moz.com/community/q/should-i-use-my-competitor-s-name-in-my-content-to-help-my-rankings

    | MikeRoberts
    0

  • Yes, link building will be extra (and often duplicate) effort. Adwords would be another concern. Other maintenance issues not so much. My goal was to target a specific visitor segment with a bonus of having the top keyword in the domain name, but recognise there are more benefits to keeping as one site. Thanks for your and Devanur's responses.

    | SteveMauldin
    0

  • Hi there, We believe that microsites are handy to have in terms of linking root domains, and they definitely work well — but obviously you don't want to get penalized by Google. Hiding the fact that the microsites are connected with the main site makes sense: register them under different names and serve them from a different hosting provider. Frutiko Team

    | FRUTIKO
    0

  • As long as you don't 301 penalized pages/websites to a non-penalized site, you shouldn't have anything to worry about. However, if you are sure your current site has pages penalized rather than a link penalty (which means that incoming links are not being counted, not that your site is penalized), then you shouldn't 301 anything if you are planning to "start over". There have been reports from people with penalized sites who moved to another domain and were able to recover. I would only recommend such a move if you are now working only with your previous customers (those you had before the penalty). But if you are still gaining search traffic and customers, then you should consider fixing the issues instead of moving. Just my 2 cents.

    | FedeEinhorn
    0

  • I used the disavow tool to take some negative backlinks off a client's site. It was pretty easy; just follow the guidelines set by Google to the letter. The client was very angry and talking about a lawsuit, but that is not something I have any experience with. Apparently it is a big deal and a federal crime, because they talked to the FBI and the Feds actually listened. Once the disavow request was processed, the links still showed in the backlink report, but appeared with a "no follow" tag next to them. There may have been no correlation, because I was doing several SEO-related things for them, but the rankings did jump right after the request was processed.

    | ZeroWing
    0

  • I am trying to put together a set of rules to help decide when a directory link is harmful and warrants removal. So far:
    - The directory has xxx and gambling listings.
    - The directory is no longer in Google's index (site:domain.com shows no results); I take this as a Google penalty.
    - The directory's home page, and the page where my link exists, have low MozRank/PR.
    - My link is site-wide on the directory.
    Suggestions and recommendations welcome.
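Rules like these can be treated as a simple checklist. A toy Python sketch (the cutoff of two warning signs is arbitrary, not an established rule, and the signal names are just illustrations of the list above):

```python
def directory_verdict(adult_or_gambling_listings, deindexed_by_google,
                      low_authority_pages, link_is_sitewide):
    """Count the warning signs a directory trips and suggest an action."""
    warning_signs = sum([adult_or_gambling_listings, deindexed_by_google,
                         low_authority_pages, link_is_sitewide])
    # Arbitrary cutoff: two or more warning signs -> remove/disavow.
    return "remove" if warning_signs >= 2 else "monitor"

# A deindexed directory carrying a site-wide link trips two signs:
print(directory_verdict(False, True, False, True))  # -> remove
```

You could weight deindexing more heavily than the others, since it is the strongest single signal of a penalty.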

    | VaiSam
    1

  • An article we wrote a while ago with a few examples. It's my site but answers your question: http://buildforsearch.com/googles-worst-penalties-and-what-happened/

    | Anti-Alex
    0

  • This might help: http://moz.com/blog/to-catch-a-spammer-uncovering-negative-seo

    | McTaggart
    0

  • Keep an eye on your GWT links report. If you start to see many fishy links pop up there, it might be safer to disavow them periodically. This sounds like a negative SEO attack and should be dealt with proactively. If you are ranking well without those new spammy links, it's safer to keep removing them before G decides to slap you.

    | OlegKorneitchouk
    0

  • Hello, I understand clearly what you mean, and I can say I was on the other side. I ranked first in a smaller country for the most popular "blog"-related queries. The page was part of a more general website, was very solid, and solved most of the problems for users searching those particular queries. Most people clicked my search result and were pretty satisfied by what they got there. The website was old school, the layout old school. I had an amazingly attractive title and meta description; I basically nailed it. Then bigger brand names and huge websites launched on the same queries, solving way more problems and dealing with the matter in a new way. But I was still first. Brands with PageRank 5 to 7 were competing with my page, which had close to zero PageRank. I did not have even a fraction of their links and authority, and I laughed seeing that, year after year, I ranked first above these big guys; they were 2nd, 3rd, and so on. A lot of years passed and I was still first. It was really funny, and I tried to learn from it.
    Then I decided to refresh and modify the layout, because it was old school. I had some problems with internal linking, and the domain was down for a while. Then somebody hacked my server and I got some stuff injected there. I solved most of the problems; it was not easy. But when I got back, I had lost the top spot. There were a lot of changes, but the URL and the content of that particular page were exactly the same. So, from personal experience, I can tell you that things can change. I had the following going for me: I was the first to cover that area, a lot of users were clicking on my website (CTR from Webmaster Tools was amazingly high), and the bounce rate was low. And I can tell you that one of those ranking factors talked about a lot on SEOmoz — "User Usage and Traffic/Query Data" — weighs way more than people think. At least in my experience.
    Anyway, try to ask yourself the following questions: Are the differences between your website and the old one significant? Do users see them in the search listing, and do they consider them of significant importance? If not, try to give them a ten-times-better reason to click on your website, and also give them what they want (sometimes the bounce rate has something to say about this, but not always). It may look like grandfathering; it's really hard to dismiss or confirm. At first I thought about it the same way you do. However, it would be good first to answer the above questions honestly, from the user's point of view. Good luck!

    | zoicaremus
    0

  • Find out if the content manager knows about keyword research. I do content management and have a graph on my wall that has two axes... The horizontal axis is "Profitable" The vertical axis is "Linkable" Content ideas are written on post-it notes and placed at an appropriate location on that chart.  The goal is to work on those in the upper right quadrant first... then generate more ideas for that same quadrant. After all upper right quadrant content is produced then pick the most fun jobs from the upper left and lower right quadrant. Generate more ideas for all quadrants but lower left.
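The two-axis chart above can be mimicked in code. A minimal sketch — the idea names, the 0-10 scores, and the midpoint of 5 are all made up, standing in for where the post-it notes land on the wall:

```python
# Content ideas scored on the two axes described above.
ideas = [
    {"name": "news rehash",      "profitable": 2, "linkable": 2},
    {"name": "ultimate guide",   "profitable": 8, "linkable": 9},
    {"name": "product glossary", "profitable": 9, "linkable": 3},
    {"name": "fun infographic",  "profitable": 2, "linkable": 8},
]

def quadrant_rank(idea, midpoint=5):
    """1 = upper right (do first), 2 = upper left / lower right, 3 = lower left."""
    profitable = idea["profitable"] >= midpoint
    linkable = idea["linkable"] >= midpoint
    if profitable and linkable:
        return 1
    if profitable or linkable:
        return 2
    return 3

for idea in sorted(ideas, key=quadrant_rank):
    print(idea["name"])  # "ultimate guide" prints first, "news rehash" last
```

Sorting by quadrant gives the work order; within a quadrant you'd then pick by fun, as the post suggests.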

    | EGOL
    0

  • I believe so. I did notice a small upward trend for long tails of product categories on our site that we had on MacRae's. I can't say for sure, but it definitely did not hurt.

    | KevinBudzynski
    0

  • Thanks, Adam. We came on board to do the on-page SEO (as they hadn't done any); they had just been link building. We suggested they stop the link building, since the anchor-text links were pretty much all going to the home page and some of the links were low quality. We've got them to stop all this dodgy link building and even undo some of it. But because they haven't seen instant results with us (even though we have shown them the problems they had), we are in focus now. I believe the structure of the website to be OK, but I would appreciate an outsider checking (I can send you a link via LinkedIn if you are happy to take a quick look). I've pulled reports on each keyword and page again this afternoon, and I can see we now need to do some quality link building for the client, which we will suggest, having pulled them out of the Penguin/Panda updates (thanks to their previous SEO company).

    | SocialB
    1