Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Thanks, Ruth. I think I'll do this -- double check to make sure authorship is correct (which is, quite frankly, not easy to do!) and just take it easy on the links I put back to my site in any upcoming HuffPo columns. I appreciate the helpful feedback!

    | TomNYC
    0

  • If you use the words Weber 522053 in the query, Google is likely to return the title with Weber 522053 in it, even if it doesn't return it that way for a different query, because Google sees that Weber 522053 is important to the querier. But this does show that Google knows what the intended title is and is shortening it for its own unfathomable Google-ish reasons. (I did notice that Weber 522053 is not visible on the page at all, so possibly that is what makes Google think that it is not important information to display in the SERPs.)

    | Linda-Vassily
    0

  • Hi there. No, the canonical will not pass the meta robots directive to the original page, so you're safe there. What you're effectively doing is using two ways to prevent duplication: the canonical will instruct web crawlers not to index versions of the URL with query strings, just as the noindex,nofollow tags will. Nothing wrong with using two methods simultaneously to do this (always a good idea to be safe), and so the end result is that the URLs with query strings will be very, very unlikely to be indexed.

    | TomRayner
    0
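
    To make the setup above concrete, the belt-and-braces approach (a canonical pointing at the clean URL, plus a robots meta tag on the query-string version) might look like this in the parameterised page's head. The URLs here are hypothetical placeholders, not taken from the thread:

    ```html
    <!-- Served on a hypothetical query-string URL such as https://example.com/page?sort=price -->
    <head>
      <!-- Point crawlers at the clean, canonical version of the page -->
      <link rel="canonical" href="https://example.com/page">
      <!-- Additionally ask crawlers not to index or follow this parameterised version -->
      <meta name="robots" content="noindex,nofollow">
    </head>
    ```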

  • I've usually seen it in cases like what Istvan mentioned - somehow, another signal comes into play. Maybe it's new links to the non-canonical URLs, maybe some internal pages with old links get crawled, maybe a new 301 or canonical comes into play that conflicts with the existing canonical. If they're being ignored now, then it's possible you're using the canonical tag as a band-aid, for lack of a better term, and the underlying problem that caused the duplicates is still in play. If Google's really being indecisive, you may want to take a closer look at that underlying problem and not just rely on canonicals. Generally, the tag is pretty strong, but Google does get it wrong from time to time. Sorry, it's hard to advise based on generalities. The devil is in the details in these situations, I find.

    | Dr-Pete
    0

  • I rebuilt a site and got it back in the running with Google a year ago. They have contracted no work on it since then. It has dropped about 9 or 10 positions in Google, but recently fell out of the top 50 in Bing/Yahoo. Once out of the top 50, who knows where they are - off the map! I have noticed other sites with formerly similar rankings drop significantly as well. In many cases, Google stays up while Bing/Yahoo goes significantly down.

    | dcmike
    0

  • If your niche site is working better for you right now, there seems little point in 301-ing it to your main site. Not only do you risk losing rankings (as your main site may not rank for the terms your niche site used to), you've already said your main site doesn't convert as well - so even if you get the main site ranking in place of the niche site, you'll still likely lose sales. If I were you I'd keep the two sites for now; as Gary says, don't interlink them, and try to make sure the content isn't exactly the same. However, I would encourage you to consider what you might want to do in the future. Where do you want to invest your time and focus your attention? Do you want to improve your main site, or your niche site? Continuing to invest in both (i.e. dividing your resources) may mean you're left with two sites which are just 'ok' rather than one really great site - which might not be the smartest move long term.

    | Hannah_Smith
    0

  • Is the perceived authority of the old-time TLDs something worth investing in? Or will this fizzle away over time as the new gTLDs flood the market? .com mainly, but also .org (and in some cases .net), have survived the test of time in terms of authority/brand/image. Many other TLDs were introduced prior to .pro, and the ones above are still highly sought after. In terms of financials - as in "the .org will probably end up being about 10x the price of the .pro" - from an SEO standpoint it's a fair game, so if you can, invest the savings from buying a .pro (likely significant if you're paying big bucks for a .org) into SEO instead. Hope this helps

    | vmialik
    0

  • Hi Vjay, There's a bit more info here too from an earlier question on this back in June: http://moz.com/community/q/how-can-you-leverage-google-s-knowledge-graph-to-gain-more-visibility-in-the-serps Peter

    | crackingmedia
    0

  • Michael, Great question. I always look at Google's Keyword Planner to see how it treats it. A couple of things I noticed: 1) Google doesn't see a difference between the lowercase and uppercase "E" in e-file. 2) Google lists 30 exact-match searches for 1099 e-file software and none for 1099 efile software. I would probably go with "e-file" as the target keyword, and perhaps use "efile" in the text and image alt text, just as a backup. Eric

    | TopFloor
    0

  • I would have to second Gary Lee on this one. Chris has great info as well, but like Chris is saying, you probably need to do some rewriting.

    | NateStewart
    0

  • That's great to hear, Grant. Best of luck going forward.

    | gazzerman1
    0

  • According to Moz, my anchor text for 'balloon' is from 3 root domains and 4 links containing anchor text?

    | balloon.co.uk
    0

  • I think the case can be made for any of the three main formats. All have their pros & cons, but Google recommends schema.org microdata: https://support.google.com/webmasters/answer/99170?hl=en Structured data offers search engines more information about the site - information they can use to evaluate the relevancy of your site to a query, as well as the depth of content for a richer snippet. As far as it negatively affecting SEO, I would say take the same precautions as you would with any other tactic: implement it as cleanly & honestly as possible. If it's done with (perceived) manipulation for the sole purpose of better rankings, it can negatively affect a website's SEO. But that's just a good rule of thumb regardless of the tactic.

    | EmpireToday
    1
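
    As a concrete illustration of the microdata format recommended above, here is a minimal schema.org snippet for a product page. All names and values are invented for illustration; real markup should follow the vocabulary on schema.org for your content type:

    ```html
    <!-- Minimal schema.org Product markup in microdata format (illustrative values) -->
    <div itemscope itemtype="https://schema.org/Product">
      <span itemprop="name">Example Widget</span>
      <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
        <!-- The content attribute carries the machine-readable value -->
        <span itemprop="price" content="19.99">$19.99</span>
        <meta itemprop="priceCurrency" content="USD">
      </div>
    </div>
    ```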

  • Hi guys, Thank you for your great responses! It is actually what I was thinking - nothing to do with the implementation of the rich snippet - but because Miriam mentioned that you're not supposed to take reviews from third parties, I am a little concerned about this. So my question is then: where are you supposed to take them from? What is the purpose of the review rich snippet then? I actually took those reviews from Customer Lobby and they aren't duplicated anywhere! @Takeshi: The implementation of the rich snippets was made last week and I checked the rankings around the same time. Thanks!

    | Ideas-Money-Art
    0

  • Interesting, thanks for your insight as always EGOL. Upon further research I have found a few double listings, but they have been for specific software and the double listings are of the developer's domain. So that makes sense to me. Either way, it seems the algo is making exceptions for certain domains depending on the keyword and their authority for the actual search term.

    | jesse-landry
    1

  • The problem is not that they outrank me, but that my site disappears from the search results. No, we did not get spam links from any sex sites; we do, however, have a lot of links from our other sites (same C-block). The traditional spammy links were not used...

    | Spletnafuzija
    0

  • Hi Alex, I'm going to write from a Local SEO perspective because this is what I know, but I hope you'll get lots of feedback from traditional SEOs as well to your good question.

    Yes, for some years now, Google has been localizing more and more results. If you are searching from a Miami-based device for any term which Google perceives to have a local intent, they will typically show you results that are geographically local to you. Google does not handle all terms this way. For example, if your website is about Abraham Lincoln or Scottie dogs, searches for these things are unlikely to trigger local results. But, if you are searching for shoes, furniture, cell phones, etc., then it is quite likely that Google will presume that you are looking for a local resource and will localize the results to you. This is the case for countless terms that Google has deemed to be local, and for good or ill, this has put national, non-local business owners at something of a disadvantage.

    Simultaneously, if you are in Miami and your national client is in Denver, you will both be seeing different results. In other words, you will be seeing results that are local to Miami and he will be seeing results that are local to Denver. Because of this, there are no 'standard' results or firm rankings.

    In sum, what you are experiencing is a common phenomenon. I would expect that national business owners struggle to build enough authority so that they might be included in these perceived local results, overcoming Google's bias. Given this, I'll leave off here in hopes that some of our expert traditional SEOs can explain if/how they are overcoming this bias. I hope my explanation has helped you to see that you are experiencing something quite real and quite common.

    | MiriamEllis
    0