Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • Thank you both for the responses, really appreciate it.  From what I can tell in G analytics we're not receiving any benefit from these links whatsoever, so I'll contact them first before going the disavow route and ask them nicely to remove us.  Thanks again!

    | mjmorse
    0

  • I only had a quick look at your site, but I think recovery could be difficult.  Panda generally affects thin sites or sites with duplicate content.  I didn't check for duplicate content but it did look like there was a lot of thin content.  There are a number of pages that offer very little value to someone searching on Google.

    | MarieHaynes
    0

  • I worked with a similar company operating in the UK and Australia for a while; they didn't implement my advice and traffic plunged. I think the biggest issue you face is that these pages carry the same content (excerpts of profiles) as the individual lawyers' pages, so each page contributes no unique value to the consumer (searcher). There may also be a whole host of canonical and sitemap challenges you are facing.

    | peterlaurent
    0

  • Thanks Carson. I would tend to agree were it not for the fact that Tripadvisor is so adept at SEO. Not sure how to rationalize this behavior alongside their reputation. Assumed that I was missing something...

    | mario33
    0

  • Sucks, don't it? Google isn't perfect, and there are still many flaws that let sites like this rise to the top. I've seen this happen many times (especially in the more competitive industries). This is the difference between black hat and white hat SEO. White hat route: keep providing good material and earning higher-quality backlinks, and eventually you'll overtake him (and the results will be more permanent). Fast & risky route: it looks like he just did a big social bookmarking blast to those sites. You can find these services around the web and order them for yourself (get a link report first and see if the sites match up). You'll be on even ground, but the downside is the risk of going under in the next update. Best of luck, Oleg

    | OlegKorneitchouk
    0

  • Just seeing this post now. Does anyone find it ironic that the NYT drops a followed link to JCPenney in the article?

    | Harbor_Compliance
    0

  • There seems to be some confusion surrounding XML sitemap usage, so I'd like to clear that up. Many people question the need for an XML sitemap, and for frequently updating it. However, creating and submitting an accurate, up-to-date, and properly formatted XML sitemap is a fundamental part of good on-site SEO and really pays off. Websites are never hurt or penalized in any way by using sitemaps; even if you already have an XML sitemap, adding a second one will still help. In fact, if you look at Google's own robots.txt file ( http://www.google.com/robots.txt ) you can clearly see they use multiple sitemaps themselves.

    Google recommends XML sitemap submission in all of their Webmaster and SEO tutorials: "Whether your site is old or new, we highly recommend you submit an XML Sitemap" - Google Webmaster Tutorial. In fact, NOT using an XML sitemap is #4 on SEOmoz's list "The Biggest SEO Mistakes SEOmoz Has Ever Made", written by a widely recognized SEO expert, the CEO of SEOmoz himself: http://www.seomoz.org/blog/whiteboard-friday-the-biggest-seo-mistakes-seomoz-has-ever-made

    Attracta is the largest supplier of XML sitemaps, with over 2.8 million websites using them, which seems to be what gives them such a high PR. Also, Attracta sitemaps are free, show you exactly when Google and other search engines access them, and can be updated and resubmitted at any time.
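    For reference, a minimal XML sitemap is just a list of `<url>` entries following the sitemaps.org protocol (the URL and date below are placeholders, not from any real site):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; <lastmod> helps crawlers spot updates -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2013-01-01</lastmod>
      </url>
    </urlset>
    ```

    Save it as sitemap.xml at the site root and submit it in Webmaster Tools.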

    | InRealTime
    0

  • This is by far the most likely explanation as far as I can see. Thanks for checking it out Takeshi!!

    | AdoptionHelp
    0

  • "I like the concept that a directory is good only if you would still want the link if it passed no link juice." Yeah, I agree that this statement is pretty naive. You shouldn't expect to get much traffic from Yahoo or DMOZ. People use these directories for the link juice and the credibility boost, not traffic. Since both directories editorially approve their links, they still carry a bit of weight (although exactly how much is up for debate).

    | TakeshiYoung
    1

  • This is going to sound preachy, but I would be very slow to point fingers.  Perhaps there were multiple SEO companies involved.  If Branded3 was indeed their most recent SEO, it doesn't mean that they were involved with the most egregious spam.  (I'm not saying that they weren't - just saying that we don't know.) Also, I think you will find that the vast majority of effective SEO companies out there are using techniques that, if reported, would not satisfy the Google guidelines.  I challenge you to rank a flower shop number one using ONLY white hat techniques.  At this point in time, with the level of effective spam that is out there, it would be very hard to do. In my opinion, whether or not we point fingers depends on what was discussed between the SEO company and Interflora.  It's possible that the agreement they had was that the company was to do all they could, even if it meant bending some rules.  I've had some people come and ask me to rank their sites using only completely white hat methods, and I've had others come and say, "I don't care what it takes - just get it done and we'll deal with the consequences if they come."  Sometimes the short-term gains are worth it.  Plus, until recently, most websites were not aware that you could get deindexed for pushing boundaries too much.

    | MarieHaynes
    0

  • I have a paid subscription to the Sucuri plugin, and I will reactivate it. My web host is not a $6 one; it is a $400 shared host. They are actually good, and thanks to them I could find the files on the server. What I cannot find is where the gate is, and whether there is something on my computer or on the website, because the attacks start on newly created content pages and less on old ones.

    | maestrosonrisas
    0

  • Derek, PS: you would get a much better rank, and the site would look sharper, if you swapped YouTube out for http://wistia.com/ videos. They have an SEO sitemap, and you get 5 videos for free for life. All the best, Tom

    | BlueprintMarketing
    0

  • If it can be proven that the intention was to cause harm to another company's profits, I would think you could be held liable. There is enough documentation on the web showing that Google penalizes bad links and that negative SEO exists. If there is proof that you were doing what Google tells you not to do against your competition, and it resulted in the penalty Google says will happen, then it seems like bad intentions could be proven, and in that case you could be found guilty in a court of law. I am not aware of any precedents, though.

    | irvingw
    0

  • You said, "If I am not mistaken noindex nofollow on the internal links could be seen as page rank sculpting where as onpage meta robots noindex nofolow is more of a command like your robots.txt". This used to be an issue, but now nofollowing a link throws away the PR. It used to be that if you had 4 links on a page and nofollowed 2 of them, all the PR was pushed to the remaining 2 links. Now if you nofollow two links, the PR is still divided by 4 and you're throwing away half of your PR, so Google no longer cares what you nofollow internally. Nofollowing links to your pages is not enough to keep those pages from getting indexed; you need the noindex, follow meta tag on the page itself. Even a robots.txt disallow sometimes isn't enough, because Google can come in from a deep link and index the page. Why aren't you using URL parameter handling in WMT to tell Google not to index pages with the ?filter parameter in the URLs?
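    To illustrate, the meta tag described above is one line in the page's head (a sketch, not your actual markup):

    ```html
    <!-- Keeps this page out of the index while still letting
         PageRank flow through the links on it -->
    <meta name="robots" content="noindex, follow">
    ```

    Put it on each filtered page you want excluded, not in robots.txt; Google has to be able to crawl the page to see the tag.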

    | irvingw
    0

  • You may have missed my point.  Sucuri didn't fix my problem either, but when I hired Michael (see link in my response above) he had the expertise to fix it.  I agree with EGOL that in some cases you need to hire a pro. (I have no allegiance or connection to Michael other than the fact that he saved me so much headache after weeks of struggling with malware that other people couldn't fix.)

    | MarieHaynes
    0

  • I'm not a fan of doubling up, but only because it makes the results really hard to measure. If you implement both, you won't know which one worked, ultimately. I'm not sure it's actually harmful - it just can be hard to track. If you're just trying to prevent future problems (and don't have any immediate issues), I'd probably pick one and give it a few weeks.

    | Dr-Pete
    0

  • I've found OSE Firewall to be very, very good. Also have a look at Login Lockdown; it hasn't been updated for a while, but it still does the job.

    | TomRayner
    0