Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey Alberto, if those 44 posts are set to "private", how is Moz finding them? Somewhere on your blog you are linking to them. Find those links and take them down. Then you should consider two things: do those pages get any organic traffic? If yes, try to create a page that isn't a copy, with unique and useful content, and set it as public. If there's no organic traffic, just leave the 404, and make sure it's a nicely designed 404 that offers some value to the user, not only for these 44 pages but for any 404 issue in the future too. You could, for example, check the requested URL, search your site for related content, and, while saying that the page they were looking for is missing or removed, suggest those other pages. Hope that helps!
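To sketch the "helpful 404" suggestion idea in code (a hypothetical example, independent of any particular CMS; the page list and matching rule are my own illustration):

```python
# Minimal sketch: when a removed URL is requested, suggest existing pages
# whose slug words overlap the requested path. Page URLs are made up.

def suggest_related(requested_path, site_pages, max_results=3):
    """Rank site pages by how many slug words they share with the 404'd path."""
    wanted = set(requested_path.strip("/").lower().replace("-", " ").split())
    scored = []
    for page in site_pages:
        words = set(page.strip("/").lower().replace("-", " ").split())
        overlap = len(wanted & words)
        if overlap:
            scored.append((overlap, page))
    # Most overlapping words first; ties broken alphabetically.
    scored.sort(key=lambda pair: (-pair[0], pair[1]))
    return [page for _, page in scored[:max_results]]

pages = ["/seo-basics-guide", "/advanced-seo-tips", "/contact"]
print(suggest_related("/seo-tips", pages))
# → ['/advanced-seo-tips', '/seo-basics-guide']
```

A real implementation would plug something like this into the 404 template, falling back to a plain "page not found" message when nothing overlaps.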

    | FedeEinhorn
    0

  • Are you using hreflang correctly? It contains both the language and the location. If, for example, your French version is being served with hreflang="fr", that means users with a French-language browser should see that page, not that users in France should see that page. If you are running a German browser while searching in France, you will probably get the German version first, as hreflang="de" is doing its job. If you'd like to serve the French version even to German browsers, you will probably need to add hreflang="de-FR" and point it to the French version. Even with all the hreflang tags you may need, Google will ultimately serve what they think is best for the user. There's no way to "force" them; hreflang is just guidance.
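To make the mapping concrete, here is a small sketch that renders the alternate link tags described above (the example.com URLs are hypothetical; the point is that "de-FR", German speakers located in France, can be pointed at the French page):

```python
# Sketch: render <link rel="alternate"> tags from an hreflang -> URL mapping.
# URLs below are illustrative placeholders, not from the original question.

def hreflang_tags(mapping):
    """Return one <link> tag per hreflang code, sorted for stable output."""
    return [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(mapping.items())
    ]

tags = hreflang_tags({
    "fr": "https://example.com/fr/",     # French-language browsers
    "de": "https://example.com/de/",     # German-language browsers
    "de-FR": "https://example.com/fr/",  # German speakers in France -> French page
})
for tag in tags:
    print(tag)
```

Note that several hreflang codes may legitimately point at the same URL, as "fr" and "de-FR" do here.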

    | FedeEinhorn
    0

  • Thanks for clarifying. Each site needs its own author page. I am comfortable with creating reciprocity with GG+. Thanks!

    | seagreen
    0

  • Hey Shaun, I wouldn't be so concerned with the length of the URL (see this Moz page, http://moz.com/search-ranking-factors, where URL length ranks second from the bottom among search ranking factors according to the 120 leading search marketers who participated), as you could put more effort and time toward some of the other on-page SEO items from the list. Just make sure the Categories, Archives, and Tags from the blog are marked as NoIndex/NoFollow. If you are using the All In One SEO plugin for WordPress (which I recommend), then you just have to check a couple of boxes on the Settings page to accomplish this and make the blog more SEO/search engine friendly. Hope this was helpful! - Patrick

    | WhiteboardCreations
    0

  • Hi there, has your question been resolved? We would love an update, thanks! Christy

    | Christy-Correll
    0

  • Rand talks about this here: http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful. The key word is "may" be penalized. If you follow Forbes' example, you should be OK. With all due respect, though, you are not Forbes, and while users may put up with it there, they may not like it on your site. Watch your analytics, as you may see users leaving your site more quickly.

    | CleverPhD
    0

  • Thanks, Peter! I knew when taking this image that it would come in handy for SEO purposes.

    | customerparadigm.com
    0

  • You should just noindex the archive pages, tags, and other pages that may have duplicate content. Let's take a blog as an example. You let spiders index and follow all links from your homepage and its pagination; then each post is filed in a category, maybe including tags. You then have to decide, and it's really up to you, what to let spiders index (always use follow; there's no reason to use nofollow). Personally, I would allow spiders to index the homepage (including pagination) and the categories (and their pagination), but I'd put a noindex on the tags and archives (pages that show posts within a specific date range), while still allowing all of them to be followed.
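That indexing policy can be summarized in a few lines (my own example names, not a WordPress or plugin API):

```python
# Sketch of the policy above: everything stays followable, but tag and
# archive pages get noindex. Page-type names are illustrative.

NOINDEX_TYPES = {"tag", "archive"}

def robots_directive(page_type):
    """Return the robots meta content for a given blog page type."""
    if page_type in NOINDEX_TYPES:
        return "noindex,follow"
    return "index,follow"  # homepage, pagination, categories, posts

for t in ["home", "category", "tag", "archive", "post"]:
    print(t, "->", robots_directive(t))
```

The returned string would go into a tag such as `<meta name="robots" content="noindex,follow" />` in the page template for that page type.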

    | FedeEinhorn
    0

  • Thanks for taking a look anyway Chris. Does anybody else have any suggestions on what might be going on here? Cheers, Greg

    | G.Anderson
    0

  • Thanks for the clarification Peter - appreciated! Luke

    | McTaggart
    0

  • Well, that's hard, as I don't know your URLs and your parameters. You need to come up with a solution that covers them all, but also avoid any duplicate content issues by redirecting the parameter-based URLs to the rewritten ones. Let's say the file serving paintball masks and goggles is Paintball-Masks-And-Goggles-0Y.aspx, and the Manufacturer parameter shows only the products from that manufacturer. But does the naked URL show all of them? If yes, then you have to noindex all of the URLs with the parameter set. If no, then you can use URL rewrite rules to make static, easy-to-read URLs, with something like this:

    RewriteRule ^Paintball-Masks-And-Goggles/(.*)$ Paintball-Masks-And-Goggles-0Y.aspx?Manufacturer=$1 [L]

    This means that users accessing /Paintball-Masks-And-Goggles/Empire will see the same page as /Paintball-Masks-And-Goggles-0Y.aspx?Manufacturer=Empire, but in a friendlier way. That works if you have lots of manufacturers for paintball masks and goggles. Alternatively, if you have many manufacturers but not that many products from each, you can write a different rule, like:

    RewriteRule ^(.*)/Paintball-Masks-And-Goggles$ Paintball-Masks-And-Goggles-0Y.aspx?Manufacturer=$1 [L]

    which produces the same effect but puts the manufacturer name at the front of the URL: /Empire/Paintball-Masks-And-Goggles. This involves not only creating a set of rewrite rules but also changing the code on your site to use the new URL structure those rules create. If you don't have the knowledge to make these kinds of changes, I suggest you contact a web developer to carry out all the necessary steps. Feel free to private message me if you need more help.

    | FedeEinhorn
    1

  • Hey there, I would download all of your link data from Webmaster Tools, OSE, Majestic, and maybe Ahrefs too, then pull it together and comb through it for bad links. I think you'll really have to look through them to see what's going on. Maybe something was missed? First you need to confirm there actually are no spammy/bad links. In a removal/disavow situation, the goal is to remove or disavow ONLY bad links, and there could be only 10 bad ones out of hundreds, so you should sort through them. -Dan

    | evolvingSEO
    0

  • Hi, the Moz assessment of keyword difficulty for "send SMS online" is 62%, which is defined as follows: Highly Competitive. Powerful sites with strong pages tend to dominate these results; links in quantity and quality (at both the domain and page level) are required to earn top rankings. I think if you don't have an established site and a track record for this already, your chances of ranking well enough to earn traffic from search engine clicks are small to zero. If you have a product to sell for this term, then my advice is that you are better off spending any money you have on Google AdWords. But at the USD 2.12 cost you listed, it's possible you won't see a return there unless you are hoping to sell packages for sending SMS online in high volumes. I hope that helps, Peter

    | crackingmedia
    0

  • Sanaa, what that does is actually create a new "sub" domain that is distinct from the pennies.com domain. Depending on how closely tied a subdomain is to the root domain (via links and topics), the subdomain and the pages within it all rank independently of the root domain. There are wide variations in the degree of independence a subdomain has from the root, but in answer to your question: it doesn't "hurt" SEO, but it does require a different strategy to achieve the desired results. Here's some reading on the whole domain topic for you: http://moz.com/learn/seo/domain

    | Chris.Menke
    0

  • I agree with Chris. It's not just a case of ranking for a keyword; you've got to be able to do something with that traffic in order to satisfy the goals of your site/business. So you need to consider the relevancy of the page, the question the searcher is asking, and their intent. It's tempting to try to use your most authoritative page (typically your home page) to compete for your most competitive search terms, but you can't burden your home page with every keyword. Really, you should be looking to get the searcher to the most relevant page on your site for their query. The more specific the keyword, the more specific the page. Think about which page is more likely to satisfy the searcher's query and has a higher chance of getting them to convert: a generic home page, or one that's about the very specific thing they were searching for? Target keywords with the wrong page and you can see some high bounce rates. It's just a shame that in this (not provided) world we're losing the ability to track bounce rates for specific keywords.

    | DougRoberts
    0

  • I 100% agree with Alan here, as the purpose of a rel=canonical implementation is to hint to Google which pages are non-preferred (the duplicate or near-duplicate versions of the original, preferred pages) and should not be indexed. Best, Devanur Rafi

    | Devanur-Rafi
    0

  • Thank you so much! I don't really like the way the third-party vendor (the IDX supplier) set up the site. Do you know of any good IDX site providers? Thanks again.

    | mrodriguez1440
    0

  • Kane, I could not agree with you more. For instance, I have a client that currently gets 26,000 visitors a day because of an awesome campaign that went viral; I could not be happier. One page, the one containing the news story, receives 10,000 visitors. While the homepage might get much of the direct traffic from all the news sources online, there is a call to action to view the specialty item. Sorry I can't get more into it. Either way, we picked up traffic immensely, and because of the amount of social sharing, along with the incredible link velocity being pushed to the homepage and the item page, granted, the item page is starting to get more links, and its shares are beyond what the home page has by tens of thousands. What I'm getting at is that because of the links and the social aspect, this keeps snowballing, and that is something I'm extremely proud of. We are going to be on the largest news network's morning show next week, and I anticipate this is just the beginning of the attention focused on just two pages, one being the home page. So obviously I know you are right: the more links and social shares, the better the page will rank. I could write gobbledygook as the title and it would still get insane traffic. I hope that helps, Thomas

    | BlueprintMarketing
    0

  • I'd second Federico Einhorn... He's correct about the description tags. I did check the robots.txt file (looks okay), and the meta info also looks okay: <meta name="google-site-verification" content="ruda0O6aWmbMZZc-UyJEYH4lx6e8T41glM3QIo_Ae7Y" /> <meta name="robots" content="index,follow" /> <meta name="GOOGLEBOT" content="index,follow" /> But I agree that it might not be included in Google Webmaster Tools correctly, due to this message: "Google promotion - Try Google Webmaster Tools - www.google.com/webmasters/ - Do you own shop.riversideexports.com? Get indexing and ranking data from Google." (see screenshot: google-site-shop-riverside-exports.com.jpg)

    | customerparadigm.com
    0