Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Kashif, it may be time for you to go out and hire a firm to do an audit of your site. We have a list of recommended companies at http://moz.com/article/recommended. Also, please check your email associated with your account. I've sent you an important email regarding your account, and want to make sure that you see it.

    | KeriMorgret
    0

  • Google doesn't always tell you when they take action on a site.  It could have been some minor algorithm tweak that put you over the edge in some capacity.  We took over two clients who had a massive amount of linking between their own websites.  This caused a penalty.  I'm 99% sure this is the cause, because removing the links fixed it, putting them back broke it again, and removing them the final time fixed it for good. Maybe look at your link profile for excessive sitewide links, or any over-optimization.

    | WhoWuddaThunk
    0

  • No problem Gordon. This is just me providing an example which can be modelled next to any project, however big or small.

    | GaryVictory
    0

  • I checked Moz's Open Site Explorer and Ahrefs, which are both good sources of backlink data. Structured data is a nice thing to have but wouldn't necessarily hurt your ranking.  It can help Google more easily make sense of your content and also help you stand out a bit more in the SERPs if Google chooses to show a rich snippet result for you. Getting rid of bad backlinks can be a manual task of reaching out and contacting webmasters, and there are some tools that can make that process a little less time consuming.  However, I didn't review your links for quality, just noting there were a lot of links from a small number of domains.  The top one looked like the personal site of the company owner or broker.

    | mosquitohawk
    0
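The "lots of links from a small number of domains" check above is easy to sketch in a few lines of Python. The URLs below are made-up placeholders standing in for a backlink export from Open Site Explorer or Ahrefs:

```python
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(backlink_urls):
    """Count backlinks per referring domain, to spot profiles
    dominated by a handful of sites."""
    return Counter(urlparse(u).netloc for u in backlink_urls)

# Placeholder data; in practice, load the URL column of a backlink export.
links = [
    "http://example-owner-site.com/page1",
    "http://example-owner-site.com/page2",
    "http://example-owner-site.com/page3",
    "http://blog.example.org/post",
]

for domain, count in links_per_domain(links).most_common():
    print(domain, count)
```

Sorting by `most_common()` puts the most heavily represented domains first, which is usually where a manual quality review should start.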

  • I'm pretty sure it was February when Yandex announced that they would no longer be using links as an SEO ranking factor. This was expected to take effect sometime in March this year - I am guessing this is what has happened. -Andy

    | Andy.Drinkwater
    0

  • Hi, Yes, I am fairly sure, as a previous SEO company caused all the bad linking. This website has all been organic growth and links. Page visits are down by half, and time on site is down by almost half. Is something else amiss? Our rankings for our biggest keyword in a competitive sector are still at 3 on Google UK, so I don't think it's a penalty. It could be a browser issue or conflict, I just don't know! The site is recyclingbins dot co.uk. Thanks for your help

    | imrubbish
    0

  • Hi Paul, I suggest you read this thread at the Google and Your Business Forum: https://productforums.google.com/forum/#!searchin/business/don$27t$20have$20login|sort:date/business/4xaVk1BwuwE/vGMWweSeo2cJ Hope it will help!

    | MiriamEllis
    0

  • Hello, Yes, Google uses many signals to identify the related searches for a query, but the most important factor is user behaviour in Google searches. When many users type "X" and are not satisfied with the results they browsed, they then type "Y". Google will identify "Y" as a related search for "X". So to influence related searches on Google, you have two solutions: 1. Push/guide users to search for the Y query after X. 2. Code a script that can do some manipulation (grey hat stuff, I did it twice for some clients of mine). If you have further questions, get back to me. Thanks

    | rikano
    0

  • Hi Kate, Thanks for your response. We submitted the XML site map. And yes, we did a couple of press releases. We will wait for this to improve.

    | Cloudguru99
    0

  • Hi Peter, The problem here is most likely that you have two nearly-identical sites. It looks like you have tried to re-write content between pages like http://www.gilnahirktyres.co.uk/services/mot-belfast and http://www.belfasttyres.co.uk/services/mot-belfast, but the content is still incredibly similar. Google will still see this as duplicated: the structure of the sites is nearly identical, the paragraph structures are the same, and Google has taken a lot of measures to be able to identify this sort of thing. The reason they have focused on this is because of how easy it was to "spin" content with synonyms, etc. and have it rank well over and over again. Also, http://www.belfasttyres.co.uk/robots.txt does not block all of this duplicate content either. For users' sake, Google does not want to rank the same content more than once so they weed out duplicates that are this extreme. They want to present diverse results that give people as much opportunity to choose as possible. You will need to focus on using just one of these sites, rather than both. It's interesting that the problem is only showing itself on mobile search right now, but I would not be surprised if this rolls out to the index at large.

    | JaneCopland
    0
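The duplicate detection described above can be illustrated with word shingles and Jaccard similarity. This is a sketch of the general near-duplicate technique, not Google's actual algorithm, and the two sample sentences are invented:

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles.
    Near-duplicate pages share many of these, even after light rewording."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Invented near-duplicate snippets, mimicking two lightly rewritten pages.
page_a = "we offer mot testing in belfast for all makes of car and van"
page_b = "we offer mot testing in belfast for every make of car and van"

print(jaccard(page_a, page_b))
```

Even a couple of swapped words leave most shingles intact, which is why paragraph-level rewrites of the same structure still score as duplicates.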

  • Lots of people come here and say.... "I added millions of pages to my site and my traffic dropped." The answer is usually.... the millions of pages were thin content, duplicate content, cookie cutter content or rubbish content and now google sees the site as low quality.

    | EGOL
    0

  • Yes you are 100% right in my opinion.  The whole reason I believe for the shift in brand preference is a play on generating more ad spend via Adwords.  The search results are so unhelpful for so many searches nowadays. For example, search for "Trademark Registration" on Google.  That SERP used to be full of attorneys and websites offering that service.  Now it's mostly government websites forcing the attorneys to buy ads for this highly searched term.

    | mosquitohawk
    0

  • Great idea using Screaming Frog first. Then use Google Webmaster Tools > Crawl > Fetch as Google to see what Google sees

    | danwebman
    0
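For a rough local approximation (Fetch as Google remains the authoritative check), you can request a page with a Googlebot user-agent string and inspect the raw HTML that comes back. The URL here is a placeholder; the actual fetch line is commented out so the sketch stands alone:

```python
from urllib.request import Request, urlopen

# Googlebot's published user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("http://example.com/")  # placeholder URL
# html = urlopen(req).read().decode("utf-8", errors="replace")
# print(html[:500])
```

Note this only shows server-side differences (cloaking, UA-based redirects); it won't execute JavaScript the way Google's renderer does.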

  • If you haven't yet, please see my follow-up post: http://moz.com/blog/new-title-tag-guidelines-preview-tool This is a moving target, and it's actually a pixel width (512px), but I tried to take a data-driven approach, and as best I can measure, 55 characters is a safe limit about 95% of the time. I will add that Google definitely processes characters beyond that limit (some are even in the source code) and words beyond that limit could count toward ranking. They won't count much, I strongly suspect, but this new limit doesn't mean you automatically have to cut everything shorter. There's certainly no penalty for going over, as long as you're not keyword-stuffing to extremes. One down side is that the new method (using CSS for the cut-off) means that Google now cuts mid-word, instead of between words. This could be more detrimental to CTR, in my opinion. It's very situational, though. The best I can say is to look at your most important title tags in the context of real searches and make your own judgment call.

    | Dr-Pete
    0
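The 55-character rule of thumb above is easy to check in bulk. A minimal sketch; since the real cutoff is the 512px display width, a character count is only an approximation:

```python
def check_title(title, limit=55):
    """Flag title tags likely to be truncated in Google's SERPs.
    55 characters approximates the 512px pixel-width cutoff
    (safe about 95% of the time, per the post above)."""
    over = len(title) - limit
    return {"length": len(title), "over_by": max(over, 0), "risky": over > 0}

# Example title from elsewhere in this thread.
print(check_title("Minneapolis DWI Lawyer & Attorney | Gerald Miller"))
```

Run this over a crawl export of title tags and review the `risky` ones by hand; as the post says, going over carries no penalty, so it's a CTR judgment call, not a hard rule.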

  • For the schema.org markup, you'll find Google's schema tester here. I would use schema.org/LocalBusiness, but you can see an example of how to do the markup on one of my sites.  Start at line 447 of the source where you see schema.org/Corporation, and use LocalBusiness instead of Corporation.  I think most of the other fields will be the same, but refer to the link to schema.org/LocalBusiness above to be sure. In my example, I've used a CSS class schemaorghide that has display:none in it: while I want to include a bunch of these fields in the schema I show Google, I don't necessarily want to show all of them in the footer (and note that some of the fields are elements of multiple entities within Corporation, so I don't want to show my phone number twice, for instance). Pay attention to all of the places you see itemscope or itemprop.  And it's all done where you see the comment in the source.

    | MichaelC-15022
    0
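As an alternative syntax to the inline microdata (itemscope/itemprop) described above, the same LocalBusiness fields can be emitted as a JSON-LD script block, which avoids mixing markup into the visible footer at all. A sketch generating it from Python; every value below is a made-up placeholder:

```python
import json

# Minimal schema.org/LocalBusiness as JSON-LD. All field values are
# placeholders; substitute the real business details.
business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Tyres Ltd",
    "telephone": "+44 28 0000 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Road",
        "addressLocality": "Belfast",
        "addressCountry": "GB",
    },
}

markup = ('<script type="application/ld+json">\n'
          + json.dumps(business, indent=2)
          + '\n</script>')
print(markup)
```

Either way, run the result through Google's schema tester mentioned above to confirm the fields are being picked up.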

  • I have a few of the new TLDs and love the fact you can get something that really says what you do, but you still have to be quick. I have a live site at www.seoconsultant.guru, and got very lucky with www.changing.careers. They give some real change from the more standard .coms etc. -Andy

    | Andy.Drinkwater
    0

  • Hi Matthew, I would start with a simple one and fix the Title tag. At the moment, it says "Gerald Miller Super Lawyer: Minneapolis DWI Lawyer & Attorney". I would change this to something like "Minneapolis DWI Lawyer & Attorney | Gerald Miller". I would drop the 'super lawyer' in favor of something like "professional". That said, if people search for the phrase "Super Lawyer", then you need to re-think this. I would also drop the Meta Keywords, as Google doesn't use them at all, and worst case, they could be seen as an attempt at keyword spam. In fact, if you have these when doing a page analysis through the Moz page grader, they come up as a negative SEO point. Something to get you started. -Andy

    | Andy.Drinkwater
    0