Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: White Hat / Black Hat SEO

Dig into white hat and black hat SEO trends.


  • Is there more value in spending 3-6 months doing the coding plus content for several sites, or would there be more value in spending that effort developing content for your main site? Which is more defensible when the search engines update their algorithms? I'd vote for spending the effort on my main site rather than on other sites.

    | KeriMorgret
    0

  • "Referral spam" can be seen in your web stats program, like Google Analytics. When webmasters or site owners log in to view their stats, they see a referral from a website (usually a shortened URL). The person then follows that link, and the spammer gets more eyeballs on their site. If the spammer is a service provider, their client might be paying per 1,000 visitors (a performance-based pay scale), so getting x number of views to the site might mean they get paid a certain dollar amount. The other thing the spammer might get is ... tricking someone into buying their product or service. Scott O.

    | OrionGroup
    0

  • My plan is to move the blogs and automated content to site A. I will be changing the design of site B. Hopefully Google will reward me moving forward.

    | CLTMichael
    1

  • Thanks to everyone who commented on this! Meta, your answer seems to have valid points on different levels. I appreciate the insight!

    | MorganPorter
    0

  • I pick up what you are putting down... thanks for the clarification, Frank. Here is how I think things would play out: If you are creating /super-awesome-best-thing-ever as a new page that does not exist yet on your site, it will have very little rank. If you redirect site.com to site.com/super-awesome-best-thing-ever, you will have a partial drop in PageRank due to the 301 redirect. Just because you are redirecting your homepage to /super-awesome-best-thing-ever does not mean it will get the full power of your old homepage, site.com. You would also need to change all of your internal links to point to site.com/super-awesome-best-thing-ever instead of site.com, because you'd lose some of your link juice flow if you also used the redirect internally. And if you use a 301 redirect, you are telling Google that this change is permanent - implying that you are no longer going to use your domain site.com. Does that mean Google would remove the URL site.com from the index? I don't know. I do know that SEOmoz places keyword usage in the URL under the "moderately important" section of its on-page analysis - that said, I wouldn't go to the trouble of doing this. I think you are better off optimizing your homepage for the keyword or creating a /super-awesome-best-thing-ever landing page. Hope this helps. Mike

    | Mike.Goracke
    0
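For concreteness, the homepage-to-landing-page 301 Mike describes (and advises against) could be set up like this on an Apache server - a sketch only, assuming Apache with mod_rewrite enabled; the landing-page path is Mike's hypothetical example:

```apache
# Permanently (301) redirect the homepage only to the new landing page
RewriteEngine On
RewriteRule ^$ /super-awesome-best-thing-ever [R=301,L]
```

Note the `^$` pattern matches only the root URL, so the rest of the site is unaffected.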

  • Hi Virginia. This is frustrating indeed, as it certainly doesn't look like you've used duplicate content in a malicious way. To understand why Google might be seeing these pages as duplicate content, let's take a look at the pages through Googlebot's eyes: Google Crawl for page 1; Google Crawl for page 2. What you'll see here is that Google is reading the entirety of both pages, with the only difference being a logo that it can't see and a name + postal address. The rest of the page is duplicate. This should point out that Google reads things like site navigation menus and footers and interprets them, for the purposes of Panda, as "content". This doesn't mean that you should have a different navigation on every page (that wouldn't be feasible), but it does mean that you need to have enough unique content on each page to show Google that the pages are not duplicate. I can't give you a percentage on this, but roughly 300-400 words of unique content would do the trick. Now, this might be feasible for some of your pages, but for the two pages you've linked to above, there simply isn't enough you could write about. Similarly, because the URL generates a random query for each employer, you could potentially have hundreds or thousands of pages you'd need to add content to, which is a hell of a lot of work. So here's what I'd do. I'd get a list of each URL on your site that could be seen as "duplicate" content, like the ones above. Be as harsh in judging this as Google would be. I'd then decide whether you can add further content to these pages or not. For description pages or "about us" pages, you can perhaps add a bit more. For URLs like the ones above, you should do the following: in the header of each of these URLs you've identified, add this code: This tells Googlebot not to index the URLs or follow their links. In doing that, Google won't rank them in the index and won't see them as duplicate content.
This would be perfect for the URLs you've given above, as I very much doubt you'd ever want to rank these pages, so you can safely noindex and nofollow them. Furthermore, as these URLs are created from queries, I am assuming that you may have one "master" page that the URLs are generated from. This may mean that you would only need to add the meta code to this one page for it to apply to all of them. I'm not certain on this, and you should clarify with your developers and/or whoever runs your CMS. The important thing, however, is to have the meta tags applied to all those duplicate content URLs that you don't want to rank for. For those that you do want to rank, you will need to add more unique content in order to stop the pages being flagged as duplicate. As always, there's a great Moz post on how to deal with duplication issues right here. Hope this helps, Virginia, and if you have any more questions, feel free to ask me!

    | TomRayner
    0
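The code Tom references above isn't shown in the answer; presumably it's the standard robots meta tag, placed inside the `<head>` of each page you want kept out of the index:

```html
<!-- Asks search engine robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```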

  • Thanks Colin. Your response clarifies "manual" warnings. I have wondered about the difference between "manual" penalties and receiving an email warning in GWT. I understand you to be saying those are one and the same. Yes? I am still hoping to learn more about the methods by which a couple of specific keywords might be penalized if no warning was received. I know this specific keyword issue is not related to inbound links in this case.

    | gfiedel
    0

  • Normally, if Google hasn't sent any warning of unnatural linking to your site, you're not supposed to send a disavow request, since this just raises the attention Google is paying to your site; however, I understand you want to prevent any issue by acting before any warning message. About the disavow request: if it's been received, Google should have taken it into consideration. You said that the page was an offending page, so you want to 404 it. I think that if you're getting rid of that page by returning a 404, you're preventing that (bad) value from being added to your domain. I think that this is a correct approach. Just be sure that those 879 links are all bad, because you don't want to lose all that value. If there are good links pointing there, ask those webmasters to point to your homepage instead. Just be sure that if there was some good content in that page, it has moved to your home before asking for a link to be changed.

    | mememax
    0

  • Hi Mark, I personally would run a clean 301 to the new site, then use something like $_SERVER['HTTP_REFERER'] (in PHP) to determine where the user came from. If it's from the old site, run a small banner in the header (like Hello Bar) to advise your users of the change. This would be a lot cleaner for search engines and very user-friendly. Don't forget to transfer the site's worth using Webmaster Tools... Hope this helps. Dan

    | djlaidler
    0
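Dan's referrer check might be sketched like this - a minimal sketch, assuming PHP on the new site; the old-site domain and the banner markup are hypothetical placeholders:

```php
<?php
// Show a one-line banner only to visitors arriving from the old site.
// 'old-site.example' is a hypothetical placeholder for the old domain.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (strpos($referer, 'old-site.example') !== false) {
    echo '<div class="hello-bar">'
       . 'Welcome! We have moved to our new home; please update your bookmarks.'
       . '</div>';
}
```

Since the referrer header is user-supplied and often absent, treat this as a cosmetic nicety, not something to rely on.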

  • Hi guys, thanks for the answers. The reason I asked is because a competing website's link building strategy is completely focused on press releases. While they haven't usurped the #1 or #2 spots in the SERPs, they have managed to place themselves solidly at #3 in a short amount of time. Sometimes what we're hearing from the search engines and what actually happens don't line up. I started this discussion to hear what people think and to find thoughtful new ideas about why this may be happening. Appreciate the feedback!

    | steelintheair
    1

  • I market on Craigslist. Shhhhh don't tell anyone.... Yes, it's a Nofollow.

    | Francisco_Meza
    0

  • It's not too late for you to start your next career, Robert. You know how new ventures can be fun. Or, you can simply do stand-up at the local pub for tips.

    | EGOL
    2

  • You need to get creative. Use dynamic content. It may be harder for coupon sites, but the best ones manage to make it happen. If you can't get 4-10, try to get at least a couple.

    | TextMarketing
    0

  • I use PRWeb.com and opt for the premium service ($360 per release), or you can get a business plan from Vocus (PRWeb's parent company) if you send out releases frequently - good savings. I also don't abuse the various options you can set up with each release for SEO, but instead choose the ones that make the most sense for proper relevance. Ideally, if you have the resources and budget to afford it, yes, it's best to work on making real connections with actual editorial desks at various channels and organizations, because that's where the golden nuggets are that will most likely lead to coverage.

    | AlanBleiweiss
    0

  • I can't say for sure if this would hurt you, but I know that we saw a definite decline in ROI after an extended period of doing nothing but press releases for our link building. It's probably a good idea to mix things up and go after some different link types if you can. Our strategy is to go after the high-value, high-investment links first, then read and comment on relevant blogs, submit to niche directories, bookmark content, post guest blogs, and join new social media communities where it makes sense.

    | AustinRealtor65
    0

  • Give this a read before deciding: The exact match domain playbook.

    | Chris.Menke
    0

  • My authority domains are not about the same thing, and they only have authority (without any positions in the SERPs), and I really want to increase my new domain's authority.

    | vahidafshari45
    0

  • When you say "disavowed", you mean that you've specifically used the disavow tool Google provides? In that case, you shouldn't need to have them re-crawled - my best guess from talking to other SEOs is that the disavow file basically acts as a layer of data on top of the link graph. Now, if you've had links removed and you want Google to acknowledge the removal, then yes, you'll need to get the pages re-crawled - or else just let time kill them off (but that could take a while). This is tricky, though - you don't necessarily want to promote a spammy page or drive more authority to it. If it was one site you controlled, you could use XML sitemaps, the Webmaster Tools URL submission form, or a service like Ping-O-Matic (http://pingomatic.com/) to nudge Google to re-crawl, but most of those solutions don't work for a bunch of URLs from other people's sites. So, you're left building links to them or somehow drawing attention to them, which can be dangerous. You can promote them in social, too, but again, then you're basically vouching for those pages, and that's not exactly going to help your social accounts. If you're using the GWT disavow, I'd just give it time. Otherwise, I'd probably try something like pinging (you'll have to hack together a list of the URLs somehow, and maybe publish them to an RSS feed) - I think that's the lowest-risk alternative.

    | Dr-Pete
    0
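The "publish them to an RSS feed" idea Dr. Pete mentions could be hacked together along these lines - a sketch in PHP; the URL list and output file name are hypothetical:

```php
<?php
// Build a minimal RSS 2.0 feed from a list of URLs you want re-crawled.
// The URLs below are hypothetical placeholders for the removed-link pages.
$urls = array(
    'http://example.com/removed-link-page-1',
    'http://example.com/removed-link-page-2',
);

$items = '';
foreach ($urls as $url) {
    $safe = htmlspecialchars($url, ENT_QUOTES);
    $items .= "  <item><title>$safe</title><link>$safe</link></item>\n";
}

$feed = "<?xml version=\"1.0\"?>\n"
      . "<rss version=\"2.0\"><channel>\n"
      . "  <title>URLs to re-crawl</title>\n"
      . "  <link>http://example.com/</link>\n"
      . "  <description>Recently changed pages</description>\n"
      . $items
      . "</channel></rss>\n";

// Publish this file somewhere crawlable, then ping it (e.g. via Ping-O-Matic).
file_put_contents('recrawl-feed.xml', $feed);
```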