Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Hi VTmoz, We didn't use to disavow a lot, since most of our clients' backlinks were built by us and we always steered clear of the "dark side of the moon". We stopped altogether, though, after last year's Penguin 4.0 update.

    | Moreleads
    1

  • Hi, thanks for this! Yes, speed is something I have also raised with our development team in France; again, I have no control over when they work on this or how. I can only make improvements locally. Our site is Key.co.uk. I want to get HTTPS implemented across the site, but again this relies on our dev team. For two years I've focused on properly optimising product titles and meta descriptions, and this has helped, but I've hit a wall. Our competitors don't seem to be building backlinks, they don't have more content than us, and I just can't find a reason for them ranking so well.

    | BeckyKey
    0

  • I agree with David. This may have been a more effective approach when exact-match keywords in domains were weighted heavily, but that is not the case anymore. You would risk, and probably lose, more value than you would gain from this transition.

    | Packaging-Group
    0

  • Google wants us to believe it is presenting results based on intent rather than keywords. In AdWords, Google doesn't even let us target singulars and plurals directly anymore, and keyword search data is presented as equal for both. In practice, however, I see that a lot is still based on keywords, and your example is one of them. I think both have a similar, if not the same, intent, so creating separate pages doesn't make sense in my opinion; we usually try to use these variations in the page content, as some pages are able to rank for both.

    | Moreleads
    0

  • Google usually pushes new sites to the top spots not because they will rank there, but because they are new. Maybe now that your site isn't new anymore, Google has removed it from the top spots... I'm just guessing here... Have you done any link building in the last three months? Anything black hat?

    | FedeEinhorn
    0

  • Hi, Occasionally you might see Google do this, where it will return a URL and, as part of the description, show a hashed anchor; but it won't show just an anchor as a result in its own right. It will always form part of the page that Google returns. It's worth doing if you think it might help usability, but it wouldn't fit every page and circumstance. I have seen it used to good effect, but am seeing less and less of these. -Andy

    | Andy.Drinkwater
    0

  • Hi, For ease of use, I always advise that any link be updated if it is clear that it has been redirected, but unless the subject has changed, it really shouldn't make any difference. If you have a link from your site to an article about breeding caterpillars, and the target site then changes this to something about the feeding habits of polar bears, that is a link I would change. Top and bottom line: if the link still makes sense, you will be fine. -Andy

    | Andy.Drinkwater
    0

  • Google is incredibly good at entity detection (i.e. figuring out when someone is searching for a "thing"). When there is a canonical right answer for that thing (e.g. the company website in the case of a company), it will often rank very well despite not necessarily having any of the "traditional" ranking factors in place. Typically, when the search is unambiguously for the thing in question, the company will rank #1 even if there are much stronger websites and pages about that company. You will rarely find a company outranked on its own company name when it's sufficiently distinguished from other entities, even if it's a really small company and there are (for example) major media stories about it. There are a variety of other factors, like query sequences (e.g. a user searching [seo] followed by [seo trucks] in your example), that Google can use to associate more specific sites with more general queries as well. Most importantly, I'm not sure there's a great deal you can do about it, nor any general lessons you can apply to your own efforts in this market / keyword space, so I wouldn't spend too long worrying about it!

    | willcritchlow
    0

  • We have never had to nofollow internal links within any of our websites. I am sure there are some circumstances where you could, but since Google's algorithm no longer "penalizes" such links and simply ignores them, I don't see a need to. Just avoid spammy tactics like having 100 links in the footer or a bunch of affiliate links, etc.

    | LureCreative
    0

  • Hey David, I thought I'd jump in here, as it's our tool. We have more information on bot verification here, including a troubleshooting section with common issues for genuine events being marked as spoofed: https://www.screamingfrog.co.uk/log-file-analyser/user-guide/configuration/#verify-bots You can also reach us via our support page: https://www.screamingfrog.co.uk/log-file-analyser/support/ Cheers, Dan

    | screamingfrog
    0

  • Good questions. I would assume it could be due to a lot of uniquely generated content appearing in a short amount of time that Google could tell was real, and that the page probably had high CTR and time-on-page metrics, etc. Another theory is that the discussion got linked to or shared somewhere, which increased the thread's authority in Google's eyes. It likely would not have ranked highly long-term; Google could have almost treated it like a timely, relevant news story.

    | LureCreative
    0

  • Hi. Check how your organic traffic compares. If it's fairly steady, there's no reason to worry. It may simply be that your referral traffic has reached its "peak" in terms of the new visitors it can send.

    | andy.bigbangthemes
    0

  • Hi vtmoz, You should definitely have pages that target brand-related queries, and this is not spam. I don't know what you sell, but you should have "landing pages" (I prefer to call them "categories") for all the things, services, or product categories your company covers. Make sure that the pages are tight and succinct, have 300-1000 words of content, and are highly contextually targeted to the different categories you cover. This is not spam; it is normal site structure, and you will not be penalised. Regards, Nigel

    | Nigel_Carr
    0

  • Hi, Yes, you are right: CTR (click-through rate) is an important factor in SEO too. If users click on your site's link in the SERPs, Google will infer that your page is relevant, and you will get a better ranking. Thanks

    | Alick300
    0

  • Hi vtmoz, It's very common for Google to change its algorithm in response to user trends. The thing to remember is that users have different forms of intent when they use certain words or phrases, and Google is constantly altering its ranking methods to reflect this. While LSI keywords tend to remain static, there can be alterations and shifts within an industry, or in how people observe and interact with it. However, these changes would not necessarily occur because companies begin using the keywords more frequently. More likely, Google is dipping into its reservoir of big data to determine that user queries leading to certain pages were not producing user satisfaction (i.e. bounce metrics were high on pages that were previously identified as supplying relevance via LSI keywords), and it therefore made a change to better reflect what users were searching for (and interacting with) from its SERPs. A couple of questions to ask:
    - Has anything changed within your industry that would cause an LSI keyword shift (new products, new competitors, new rules and regulations, etc.)?
    - Is there a pattern in terms of the keywords that have changed? Is it industry-wide or specific to a segment?
    - Are there new ways users may be interacting with the industry? New queries being used?
    - What is the impact on your rankings for general terms related to those LSI terms/phrases?
    Based on the answers to these questions, you can better identify whether it's a shift from Google altering the LSI algorithm for your industry, or simply an indicator of a developing industry. My guess is the latter. Hope this helps; feel free to reach out any time if you need a clarification or just want to chat! Thanks, Rob

    | RobCairns
    0

  • I can tell you what we do. With over 35,000 posts, we are always tweaking the better ones and dropping the dead weight. I'm currently going through a batch of posts from 2010 and running two quick tests on them. First, I check the PA (page authority) in Moz: if it's 1, that's one strike; anything higher and I consider working on the post. Next is a quick check in Google Analytics for traffic over the past six months. As you can imagine, many posts from seven years ago have zero traffic. This is strike two, and in my ballpark, two strikes means "you're out!". I delete the posts: a hard 404. As we cut the driftwood from our nets, I feel we will be more efficient at catching more fish.

    | BrvceTHW
    0
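
    The two-strike rule above can be sketched in a few lines of Python. This is just an illustration of the logic described in the post; the function name is made up, and in practice the two inputs would come from the Moz API (Page Authority) and Google Analytics (sessions over the past six months), which are not shown here.

    ```python
    def should_delete(page_authority: int, six_month_sessions: int) -> bool:
        """Apply the two-strike pruning rule: a post earning both strikes
        gets deleted (a hard 404)."""
        strikes = 0
        if page_authority <= 1:       # strike 1: Moz PA of 1
            strikes += 1
        if six_month_sessions == 0:   # strike 2: no GA traffic in 6 months
            strikes += 1
        return strikes >= 2           # two strikes means "you're out!"

    # A post with PA 1 and zero traffic earns both strikes and is deleted;
    # a post with PA 1 but recent traffic survives.
    print(should_delete(1, 0))    # True
    print(should_delete(1, 50))   # False
    ```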

  • Hi there, Yes, the overall state of the website impacts particular pages. In short: it's not only about how a particular page is optimised; what surrounds it, i.e. the other pages and the rest of the website, is also considered. So when you analyse your page, you should also analyse your website, looking at bounce rates or time on site, for example. If you want to improve pages, you should look at the site as a whole. I don't believe you can succeed long-term if 5% of your pages are over-optimised and the other 95% don't even rank. Google is becoming more and more clever. Try to optimise those 95% of pages (or at least some of them), get them some backlinks, and you'll see DA growing, which should have a positive impact on the PA of your focus pages. I hope this helps. Good luck! Katarina

    | Katarina-Borovska
    0