Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • You guys need to talk to Ryan Kent. He is the master of link penalty removal. I've sent him clients before. Here is his info: http://moz.com/community/users/312503 Notice he is #2 in all of Moz? Go with him.

    | Francisco_Meza
    0

  • Hi, Saijo and Bradley are right in saying that hiding elements on a smaller screen should not be an issue (as it's a correct implementation of responsive design). Bear in mind as well that there is a Googlebot and a Smartphone Googlebot, so as long as the Googlebot is seeing what desktop users see and the Smartphone Googlebot (which uses an iPhone 5 user agent) is seeing what mobile users see, it shouldn't be a problem. The only thing I would add: if you are going to use display:none to prevent a user from seeing something when they view your site, it's good to include an option to 'view full site' or 'view desktop site'. Also, in that case I would question whether you actually need that content on the desktop site at all, because best practice is to provide the same content regardless of device. If it's hidden but still accessible to the mobile user (in a collapsible div, for instance) there's no cloaking involved, so it shouldn't cause a problem. As a side note: the Vary HTTP header is really for a dynamically served website (that is, a single URL which checks user agent and then serves the desktop HTML to desktop devices and mobile HTML to mobile devices). Hope that helps!

    | bridget.randolph
    0
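To illustrate the "Vary HTTP header for dynamic serving" point above, here is a minimal sketch of a WSGI app that serves different HTML per user agent and sets the header. The app, the user-agent check, and the HTML bodies are all hypothetical examples, not a production implementation:

```python
# Minimal sketch: one URL, different HTML per user agent, with a
# "Vary: User-Agent" header so caches and crawlers know the response varies.

def app(environ, start_response):
    """WSGI app for a dynamically served site (hypothetical example)."""
    ua = environ.get("HTTP_USER_AGENT", "")
    is_mobile = "iPhone" in ua or "Android" in ua
    body = b"<html>mobile</html>" if is_mobile else b"<html>desktop</html>"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Signal that the response depends on the requesting user agent.
        ("Vary", "User-Agent"),
    ]
    start_response("200 OK", headers)
    return [body]
```

A responsive-design site, by contrast, serves the same HTML to everyone and does not need this header.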

  • Wow - I have received an email like that recently, and was surprised that the person emailing me would want to take his link down - but now it all makes sense.  It was a competitor all along! Talk about Negative/black-hat tactics!

    | JMacSupply
    1

  • Just saying what we did... We had a site that was hit by Panda. We had lots of very short news blurbs and some republished content from government agencies and academic institutions - much of that done at their request for exposure to our visitors. Immediately after the hit, we noindex/followed or deleted/redirected the republished content. We also noindex/followed or deleted all of the short content. The site got out of Panda a few weeks later, with some traffic loss but nothing substantial. As for improving short content, we have done a lot of that. We had lots of very short descriptions of two sentences plus one or two images that were getting nice amounts of traffic. We improved those to a few hundred words and two or three images (very time consuming, very expensive - a few hours per page). The rankings for short tail queries went up nicely and there was a huge increase in long tail traffic. We later started improving the few-hundred-word pages with two or three images to one to two thousand words plus four to eight images - even more time consuming - a day or two per page. Again, rankings and traffic went up nicely. Today, for each new article that I publish, I am making a huge improvement to a page that is a proven traffic getter but could be improved a lot. For you: take a look at the traffic into those 2700 old articles prior to your Panda problem. Some might not be worth much, but others might be golden. Then decide what to delete/redirect, what to noindex/follow, and what to improve. Then begin working. Good luck.

    | EGOL
    0

  • Paid links are a tricky area, and there are a lot of loopholes. If a company is straight-up selling you a link for money, just to manipulate Google's rankings, then that's a definite no-no. However, if you are paying for a service like Martindale-Hubbell that also happens to include a link, that could be seen as OK. Many directories also get around this by charging you a "review fee" and not guaranteeing inclusion, therefore making the payment about the service rather than the actual link. A good rule of thumb when evaluating links is to ask yourself, "Would I still want this link, even if it had no impact on Google?" If the answer is yes, then it's probably a good link. Also, evaluate the site to make sure it is high quality and in Google's good graces: does it have PageRank, are its pages indexed, does it link to spammy sites or only quality ones, etc.

    | TakeshiYoung
    0

  • Completely agree, it will be a lot of work if you have a high number of pages but definitely worth the effort in the long run!

    | stever999
    0

  • Oh, they're still indexed - got it. Yeah, that's a lot tougher. Ultimately, Google has to re-crawl these URLs, and since they're bad URLs with no internal links and only spammy inbound links, that can take a while. You can remove the URLs in Google Webmaster Tools, but that's a one-by-one process, so it's mostly for the worst culprits. Another option would be to make an XML sitemap with just these bad URLs, to encourage Google to recrawl them and process the 404s. The sitemap would also tell you how many of the URLs are indexed and let you track that number (more reliably than "site:" will). Unfortunately, you may have to build that list manually.

    | Dr-Pete
    0
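Building that bad-URLs-only sitemap is easy to script once you have the list. This is a minimal sketch; the `bad_urls` values are hypothetical placeholders for your own list:

```python
# Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs,
# e.g. the "bad URLs" you want Google to recrawl and drop as 404s.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return an XML sitemap string containing one <url> entry per URL."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical example URLs; substitute your crawl/export data.
bad_urls = [
    "http://example.com/spam-page-1",
    "http://example.com/spam-page-2",
]
print(build_sitemap(bad_urls))
```

Submit the result as a separate sitemap in Webmaster Tools so its indexed count can be tracked independently of your main sitemap.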

  • I guess part of the reason is that local search results do not show up for some of your keywords, as in "SeaTac Bankruptcy Lawyer", and so organic results are the only option. Yes, you are very right that some of this will not work for specific industries or unique geographic areas. Thanks again for all your input.

    | vmialik
    1

  • We are seeing this every day for lots of searches in the UK. I did some reading up on "domain clustering" and found that Google recently reverted its algorithm regarding how many results it displays from a single domain for a particular search term. It used to be no more than four; then it changed to seven, which is an older practice. Combine this with competition from Amazon, eBay, the .com and .nz results for transactional searches, and the bias towards brands, and most small businesses in the UK don't stand a chance of competing in organic search anymore. I for one know of several small businesses that are down 70% because of this, dropping from positions 1-3 to positions 8-12. Hence the PPC conspiracy theory: "Google did it on purpose to push us all into using PPC".

    | Silkstream
    0

  • Hi, there are lots of options to do this. Building your own redirect within the site is the best way. One option is to build a redirect handler in C#/VB that matches the old URLs with regular expressions and redirects to the new URLs. http://forums.asp.net/t/1844542.aspx Old post but still a useful resource. Or you could add all the old URLs one by one and redirect each to its new URL if you want to avoid regexes. You could also try using the web.config if the above doesn't work: <httpRedirect enabled="true" exactDestination="true" httpResponseStatus="Permanent"><add wildcard="OLD URL HERE" destination="NEW URL HERE" /></httpRedirect>

    | CityWonders
    0
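The regex-matching approach described above (a handler that tests each incoming path against old-URL patterns and returns the new URL) can be sketched language-neutrally; the original suggestion was C#/VB, but the idea is the same. The patterns and paths below are hypothetical examples:

```python
# Sketch of a regex-based old-to-new URL redirect map.
# Rules are ordered; the first matching pattern wins.
import re

REDIRECT_RULES = [
    # (compiled pattern for old path, replacement template for new path)
    (re.compile(r"^/old-products/(\d+)$"), r"/products/\1"),
    (re.compile(r"^/blog-old/(.+)$"), r"/blog/\1"),
]

def resolve_redirect(path):
    """Return the new URL for an old path, or None if no rule matches
    (in which case the request is served normally)."""
    for pattern, replacement in REDIRECT_RULES:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return None
```

In the real handler you would issue a 301 (permanent) redirect to the resolved URL, matching the `httpResponseStatus="Permanent"` behaviour of the web.config approach.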

  • Sometimes you need to leave the crawl path open to Googlebot so it can get around the site. A specific example that may be relevant to you is pagination. If you have 100 products and are only showing 10 on the first page, Google will not be able to reach the other 90 product pages as easily if you block paginated pages in robots.txt. Better options in such a case might be a robots noindex,follow meta tag, rel next/prev tags, or a "view all" canonical page. If these pages aren't important to the crawlability of the site, such as internal search results, you could block them in the robots.txt file with little or no issue, and it would help to get them out of the index. If they aren't useful for spiders or users, or anything else, then yes, you can and probably should let them 404, rather than blocking. Yes, I do like to leave the blocked or removed URLs in the sitemap for just a little while to ensure Googlebot revisits them and sees the noindex tag, 404 error code, 301 redirect, or whatever it is they need to see in order to update their index. They'll get there on their own eventually, but I find it faster to send them to the pages myself. Once Googlebot visits these URLs and updates its index you should remove them from your sitemaps.

    | Everett
    0
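For reference, the pagination options mentioned above look like this in the page `<head>` (URLs are hypothetical placeholders):

```html
<!-- Option 1: let Googlebot follow links through page 2 without indexing it -->
<meta name="robots" content="noindex,follow">

<!-- Option 2: rel next/prev on page 2 of a paginated series -->
<link rel="prev" href="http://example.com/products?page=1">
<link rel="next" href="http://example.com/products?page=3">

<!-- Option 3: point every paginated page at a "view all" canonical -->
<link rel="canonical" href="http://example.com/products/view-all">
```

Use one of these rather than a robots.txt block when the paginated pages are the only crawl path to deeper product pages.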

  • I wouldn't obsess over DA/PA. People lose sight of the fact those are just Moz's metrics. They're an excellent resource to have, but take them with a grain of salt. If a website is a good resource for you that could potentially send referral traffic, but the DA/PA isn't where you want it to be, go for it anyway. We all know this, but ultimately, the purpose of online marketing is to increase traffic and conversions (traffic being secondary to conversions). If you're forgoing either one of those two for the sake of preserving your domain authority or link metrics, you're making a mistake.

    | garfield_disliker
    0

  • You may find this discussion helpful: Moz discussion on footer links. Opinion is fairly divided on getting links on client sites. The best bet would be to try it out with some of your client sites and analyze the results for yourself. If you experience a drop in your rankings, remove or nofollow the links.

    | SEO5Team
    0

  • Good answer. We are currently working on a site audit to correct the on-page issues you mention. That said, the volume of traffic they're getting still seems unusual versus major and similar-sized competitors in their field. Their on-page keyword optimization is below par compared to most. They have virtually no apparent content strategy or social media presence. This means they're not following Google's guidelines for generating organic search traffic. They are certainly not one of the major players in the world of online accounting, so all things considered, the search traffic they're getting seems out of the ordinary.

    | PeterConnor
    0

  • Hi, thanks for the info. Is it correct then to assume, acknowledging all the cautions given, that it would be most advantageous to target just one pic per gallery to optimize, instead of trying to optimize each pic in each gallery, and spread my different keyword phrase descriptions among one picture per gallery? Thanks, Galen

    | Tetruss
    0

  • Good to hear, and yeah pretty pleased they've nearly all come up as nofollow. Thanks for the responses!

    | originenergy
    0

  • No, you will not receive any increase in your PageRank as a result. Having said that, if the other website did NOT include the canonical link then there is a chance the link juice for the page would either be split equally between your site and their site or, worst case, be given entirely to their site (if Google thinks they are the originator)! So indirectly, ensuring that they add the canonical tag will result in your page ranking better. Hope that makes sense! Steve

    | stever999
    0
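The cross-domain canonical tag being discussed sits in the `<head>` of the syndicating site's copy of the page, pointing back at the original (the URL here is a hypothetical placeholder):

```html
<!-- On the other site's republished copy of your article -->
<link rel="canonical" href="http://www.example.com/original-article">
```

With this in place, Google should consolidate the duplicate onto your original URL rather than treating the other site as the originator.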

  • Having duplicate content isn't an issue, so much as having enough unique content for each page to be seen as valuable on its own. A single template paragraph probably isn't enough, but if you can include other information such as address, driving directions, phone number, photos of the facility, class sizes, school hours, etc., that should be enough unique content for each location. You can even make the schedule an image or iframe if the duplicate content issue is a concern. Or if the schedule is identical for every location anyway, create a single schedule page, and link to it from each of the locations.

    | TakeshiYoung
    1

  • If what you've got right now is working for you and bringing in relevant (converting) traffic, then I would be cautious about doing anything too drastic. There's always a risk associated with changes like this, and the last thing you want to do is kill your own traffic. I wouldn't immediately tear down the duplicate pages, but I would start to think about how I could update some of the content and maybe create new pages that better engage your visitors and help to increase your conversion rate (I don't know what your conversion rate is). That may help offset any impact caused by a potential loss of rankings for those duplicate pages. If the pages continue to rank then it'll still help! I've got some thoughts that might be useful (please take this as constructive criticism and recognise that I don't know your niche as well as you do!). For example, the copy on your home page is all about you and very little about your visitor. What do I get if I book you for an event? What's your value proposition, what are the benefits of your particular service, and how can you differentiate yourself from the competition? A great place to start is to speak to your last 10 customers and find out why they hired you, what convinced them to hire you, and what concerns/doubts they had. I'm guessing here (you'll need to talk to your real customers), but if I was hiring you for my wedding, I wouldn't be so worried about the price or the quality of your routines (I don't know what ground-breaking magic is!) but more concerned with questions like: "What if it's all going to be a bit cheesy?" Is this going to annoy my guests? Is it going to be intrusive? Can he work with the venue? Can the performance be tailored to the theme of my event or the location? If you can figure out what really matters to people, you can quickly put them at ease and even turn these concerns into benefits. You might want to also look at how you're using images.
It can be hard on the ego, but it's not you that's the important thing here - if you can show more of the reactions and atmosphere that you create, then that may help people feel that "yes, I want some of that for my wedding/party etc." Don't bury your testimonials away on a testimonials page. You've got some great comments there about "delighting guests", "making birthdays special"... I'd use those on your relevant pages. (Personally I think they're more compelling than the "celeb" testimonials.) Segment your customers and address each group's particular needs/concerns. I'm sure you know the kind of specific issues that come up when you're dealing with corporate customers. I really do think it would help to write the content in the first person, using language as natural as possible. As it stands, the site comes across a bit cold and doesn't let your personality come through. Hope this helps.

    | DougRoberts
    0