Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Search Engine Trends

Explore current search engine trends with fellow SEOs.


  • Hey, ranking all depends on your on-page SEO and the few healthy backlinks pointing back to your blog. Since you are asking about caching and indexing, first check whether your site is visible to Googlebot: have you submitted your website to Google Search Console? Do everything needed to make your site more visible to search engines, and try to get a few healthy but natural backlinks from around the web. Also make sure your sitemap is in XML format, is accessible, and has been submitted in Search Console. Hope this helps you sort the matter out. - Sr. SEO Executive at Pick Camera
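
For reference, a minimal XML sitemap in the format the answer describes looks like the sketch below (the URL and date are placeholders, not from the thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
</urlset>
```

Save it as /sitemap.xml at the site root and submit that URL in Search Console's Sitemaps report.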

    | tranhainam
    0

  • Appreciate the answer. That was roughly my game plan: pick out our top four keywords that aren't ranking in Google spots 1-3 but would bring back the most traffic. I do have four keywords like this that are either at the bottom of page 1 or on page 2, and a better rank could bring back 11,500 to 30,000 visits per keyword. I just didn't want to focus on only four keywords (pursuing anchor-text links or high-quality links by manually reaching out to sites for just those four) if there was another plan that would distribute more juice to a wider variety of terms. But you are correct, that is the difficulty in coming up with a plan: we have so many keywords, some that bring back little traffic and some that bring back a lot. Do we spread effort across the many, or focus on the four for now and then move to a new set once a goal is accomplished?
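
One way to make the "which four keywords first" decision concrete is to score each keyword by the traffic you would gain from a realistic rank move. A hypothetical sketch (the keywords, volumes, and click-through curve below are invented for illustration, not the poster's data):

```python
# Rough industry-style CTR-by-position curve, purely illustrative.
ctr = {1: 0.30, 2: 0.15, 3: 0.10, 8: 0.03, 12: 0.01}

keywords = [
    # (keyword, monthly search volume, current position, target position)
    ("blue widgets",  30000, 12, 3),
    ("widget repair", 11500,  8, 3),
]

def gain(volume, current, target):
    """Estimated extra monthly visits from moving current -> target."""
    return round(volume * (ctr[target] - ctr[current]))

# Prioritize by estimated traffic gain, largest first.
ranked = sorted(keywords, key=lambda k: gain(k[1], k[2], k[3]), reverse=True)
for kw, vol, cur, tgt in ranked:
    print(f"{kw}: +{gain(vol, cur, tgt)} visits/month if {cur} -> {tgt}")
```

This keeps the plan data-driven: rerun the scoring whenever rankings shift, and the "top four" list updates itself.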

    | Cfarcher
    1

  • Hi, I had the same question. My site LO+EIR had several pages with and without featured snippets. I even tried asking Google via GSC to re-review the page multiple times, but sometimes it worked and sometimes it didn't...

    | lomaseir
    1

  • Yep, advice similar to the previous answer: a lost opportunity in keyword-focused internal linking, and not the best link profile. At a very basic level you could look for good-quality web directories, and also check your competitors' practices (have you gathered information on who your competitors are?). Maybe there is an opportunity to get links from sites that link to your competitors. Try to create good-quality content with statistics and references to external, relevant websites (we would advise setting those links as nofollow). Also, how are your website's mobile-friendliness and speed? These two can significantly harm your website.

    | Optimal_Strategies
    1

  • Read what EGOL wrote. It depends upon the nature of your blog pagination. There are a few reasons you could have pagination within the blog area of your site:

    1. Your articles have "next" buttons and different parts of the article are split across multiple URLs, so the content across the paginated elements is distinct.
    2. Your post feeds are paginated purely so people can browse to pages of 'older posts' and see what you wrote way back in your archives.
    3. Your blog posts exist on a single URL, but when users comment on your posts, individual posts gain paginated iterations so that users can browse multiple pages of UGC comments (as they apply to an individual post).

    In cases 2 and 3 it's not necessary to have unique H1s or page titles on such paginated addresses, except under exceptional circumstances. In case 1 you should make the effort!
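
To illustrate the archive-feed case, a sketch with placeholder URLs and titles (not from the question): a paginated 'older posts' page can carry the same base title with just a page marker, while each page self-canonicalizes.

```html
<!-- https://example.com/blog/page/2/ : an 'older posts' archive page -->
<head>
  <title>Blog - Page 2 | Example Site</title>
  <link rel="canonical" href="https://example.com/blog/page/2/">
</head>
```

A split multi-URL article, by contrast, would deserve a distinct, descriptive title per part, since each part's content is unique.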

    | effectdigital
    0

  • Yes, our subject matter experts who write content have multiple degrees and belong to professional associations. The author bios were originally kept short, so I may have to fill them out a bit, but they're definitely the "real deal" when it comes to their expertise and credentials. Thank you for your insights!

    | SallieJ
    0

  • I tried an experiment with two of my posts. The first one I updated daily; the other I intentionally left untouched after publishing. I found that the second post has ranked on Google's first page even though its content is only 250 words, while the first one has 1,500 words. The two posts target the same keywords.

    | invechseo
    0

  • There has long been a debate about whether it's easier to rank subfolders or subdomains and you'll find people on both sides of that argument. Usually inherent in that argument is that the branding and navigation will be the same across the whole site including subdomains. In your case though, you are basically building a separate site as a subdomain with its own branding, navigation, and URL structure. Even though subdomains are said to be treated as a separate site, we still feel like it's not quite the same as being a totally separate site, so our suggestion is that you will probably be better off building out that site on a new domain rather than a subdomain. We'd be curious to hear from others on this topic too.

    | Nozzle
    0

  • @Vijay Gaur I also wanted to add that we checked and haven't seen any issues with bot traffic.

    | RS-Marketing
    0

  • Thanks so much, Darin!

    | SimpleSearch
    0

  • We have informational and retail websites where we put a LOT of effort into our content. We are trying to produce the best on the web. All of this content is created and edited by people who have both formal education and deep experience in the content area.

    There is no way that we would allow user-generated content on these websites, even though we are not in a YMYL (your money, your life) type of industry. User-generated content can be excellent, but a high percentage of it is deeply flawed and far, far below our editorial standards. We have experienced people in our own industry who want to submit content, but we reject it because it is below our quality standards. That is why we don't allow user-generated content: editorial standards.

    I have read information published by Google where they say that a vigorous comment section can be a sign of a quality website. But I believe that applies to content types where opinion, kibitzing, and prattle are acceptable. Medical sites (and other types of websites) are an entirely different matter. Low-quality content can result in problems for the reader, even if it is in a comments section. Nobody knows exactly how Google views this, but I am going to protect my visitors from BS and poor-quality information.

    | EGOL
    1

  • Hi, thanks a TON for all the analysis and insights - just mind-blowing info. Unfortunately we switched between different versions of the site; the recent one will be stable for years, and further changes will be handled very carefully, without complete transformation. Our "open source crm" page dropped from April this year, but the link from Capterra was removed back in 2018: they removed our product from the list and they no longer link directly to vendor websites (you can see the page now). Not sure why we lost traffic for this page all of a sudden, even though there is not much ranking difference for its main high-volume keywords. We are going to investigate this and bring the page back to its normal traffic. Yes, we are trying to rank for "crm" as the primary keyword. Do you think we are not doing well for "crm" because we dropped for the "open source crm" page? Thanks

    | vtmoz
    0

  • You are not alone: https://searchengineland.com/google-june-2019-core-update-finished-rolling-out-on-june-8-318028 - many are complaining at the moment, including the Daily Mail and CNN. When I traced the Daily Mail's issue back to its root, it seemed mostly to revolve around Google's E-A-T guidelines, and the same could be true for your site too.

    This may seem like old news: https://www.forbes.com/sites/jaysondemers/2014/07/18/google-quality-rater-guidelines-leaked-new-insights-revealed/#5c3f798e0bde - but recently there has been an influx of people complaining about ranking drops, and one common thread seems to be E-A-T adherence (whilst another seems to be a lack of value proposition - https://www.youtube.com/watch?v=6AmRg3p79pM - just watch up until Issue #1 is outlined). Keep in mind that although the quality rater guidelines were leaked in 2014, that doesn't necessarily mean Google's algorithm(s) had adapted to incorporate those factors at the time. It seems as if Google's algorithm(s) are now taking up more in this area, where previously a lot of it was left to Google's quality raters.

    We keep seeing people come on here with informational sites, blog sites, and eCommerce sites, and the common thread right now is that most of the publishing-oriented sites seem to fail E-A-T, whilst many commerce sites are failing to add a unique value proposition to their arsenal. A lot of people also seem very determined that updates to the Medic update are still, even now, impacting webmasters (although this update is also pretty old: https://searchengineland.com/googles-august-first-core-algorithm-update-who-did-it-impact-and-how-much-303538). The point is, when the 'core' is updated, many of its internal updates and algos get re-aligned, rising or falling in prominence.

    YMYL sites seem to be getting hit really hard across the board: https://searchengineland.com/quality-raters-handbook-your-money-or-your-life-177663 - I know, from 2013. But it seems as if a lot of this stuff around the 'authenticity' of claims and statements, and around expertise, is really being scrutinised right now. If your site revolves around cryptocurrency news, then it probably qualifies as YMYL and may not be satisfying E-A-T enough, so I'd read a lot into those guidelines:

    "3.2 Expertise, Authoritativeness, and Trustworthiness (E-A-T)

    Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important. Please consider:

    ● The expertise of the creator of the MC.
    ● The authoritativeness of the creator of the MC, the MC itself, and the website.
    ● The trustworthiness of the creator of the MC, the MC itself, and the website.

    Keep in mind that there are high E-A-T pages and websites of all types, even gossip websites, fashion websites, humor websites, forum and Q&A pages, etc. In fact, some types of information are found almost exclusively on forums and discussions, where a community of experts can provide valuable perspectives on specific topics.

    ● High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.
    ● High E-A-T news articles should be produced with journalistic professionalism; they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2).
    ● High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists.
    ● High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly.
    ● High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from "expert" or experienced sources that users can trust.
    ● High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise.

    Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc. These ordinary people may be considered experts in topics where they have life experience. If it seems as if the person creating the content has the type and amount of life experience to make him or her an "expert" on the topic, we will value this "everyday expertise" and not penalize the person/webpage/website for not having "formal" education or training in the field. It's even possible to have everyday expertise in YMYL topics. For example, there are forums and support pages for people with specific diseases. Sharing personal experience is a form of everyday expertise. Consider this example: here, forum participants are telling how long their loved ones lived with liver cancer. This is an example of sharing personal experiences (in which they are experts), not medical advice. Specific medical information and advice (rather than descriptions of life experiences) should come from doctors or other health professionals.

    Think about the topic of the page. What kind of expertise is required for the page to achieve its purpose well? The standard for expertise depends on the topic of the page."

    | effectdigital
    0

  • Hi, I have seen it impact many sites: most down, a few up. It is the continuation of the rollout, or a tweaking, of the Medic update. So the first major update was August 2018, then March 2019, and then the June one referenced. It was poorly named the Medic update; it really should be called the "entity update" or the "trump update". Most are saying it is about content, but it is far more complicated than that. We have recovered a few sites from the August update. If your business sits within "YMYL" verticals, then there is plenty to be done. Usually the largest job, in terms of time, is a content audit and then a rewrite, as the content usually comes up short on audit. The technical elements are just as critical, from schema markup to uniform citations etc. Not sure it helps, but you're not alone.

    | ClaytonJ
    1

  • In my opinion, no. As far as I understand, Google doesn't look too favourably on them; maybe it assumes that a page with no links from inside its own website has little or no importance.

    | jasongmcmahon
    1

  • I agree with Effectdigital: the best method is to go to the Acquisition section and look at the data by source and medium. As well as confirming whether you are getting organic traffic, it means you can confirm where your traffic is coming from if it isn't from Google. On your keywords question, I couldn't say for certain why those tools aren't returning keywords, but what do you see if you load your site with JavaScript switched off? Sometimes JavaScript-reliant sites mean that tools like the ones you describe can't quickly pull content to get suggestions. Couple that with not yet ranking for terms these tools may have already picked up, and that could lead to what you're seeing. For what it's worth, if that is the cause I'd consider server-side rendering: the easier you can make it for machines to read your content, the better. Hope that helps.
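
The "JavaScript switched off" point can be demonstrated offline. A minimal sketch (the two HTML snippets are invented examples, not from the thread) showing what a non-JS crawler or tool actually reads: a server-rendered page exposes its copy in the raw HTML, while a client-rendered shell exposes nothing until scripts run.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text from raw HTML, skipping <script> bodies."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Server-rendered page: the keyword-rich copy is in the raw HTML.
ssr = "<html><body><h1>Blue Widgets</h1><p>Our best widget.</p></body></html>"

# Client-rendered shell: content only appears after JavaScript runs.
csr = '<html><body><div id="root"></div><script>render()</script></body></html>'

print(visible_text(ssr))
print(visible_text(csr))  # empty: nothing for a non-JS tool to read
```

If your live site behaves like the second case, server-side rendering (or pre-rendering) puts the content back where tools and crawlers can see it.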

    | R0bin_L0rd
    1

  • Hi Caroline, my recommendation would be to do as you have suggested: review keywords and pages. Make sure you have a robust "About us" section that proves you are experts, and look to introduce new high-quality, relevant content. Start actively searching for relevant, quality backlinks. Steve

    | MrWhippy
    0

  • The purpose of Search Quality Evaluators (SQEs) is to ensure that the desired results of algorithm updates are being met. They do not have a direct impact on your site: a low rating by an SQE will not directly affect it. What you may notice is that if your site is not meeting the guidelines, and other sites in that search aren't either, your site may be affected by future algorithm updates designed to filter those types of results out. But it won't be on a per-site basis; it will generally affect a particular type of search and not others. Meaning that if you scored low on one type of result but high on another, the high one wouldn't necessarily be affected by the lower rating. Again, that's because it's not on a site basis, but a search basis. I should mention, for people reading this who aren't familiar with YMYL or E-A-T, that those stand for "Your Money or Your Life" and "Expertise, Authoritativeness and Trustworthiness" respectively. These are guidelines used by SQEs to ensure certain types of sites meet higher standards. YMYL is covered in the Search Quality Evaluator Guidelines in Part 1, Section 2.3; E-A-T is covered in Part 1, Section 3.2: http://static.googleusercontent.com/media/www.google.dk/da/da/insidesearch/howsearchworks/assets/searchqualityevaluatorguidelines.pdf

    | DarinPirkey
    0

  • I would do a little troubleshooting to see what the causes might be:

    1. Check Dev Tools in Google Chrome to ensure you don't have "Disable cache" checked.
    2. Check your robots.txt file to ensure you aren't blocking Google (look for something like "User-agent: Googlebot" followed by "Disallow: /").
    3. Look at Search Console for manual actions: go to "Security & Manual Actions" > "Manual actions".

    Let us know what you find and we can go from there.
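
The robots.txt check can also be done programmatically with Python's standard library. A small sketch (the rules and URLs below are hypothetical, not from the question): parse a robots.txt and ask whether Googlebot may fetch a given page.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot is blocked from /private/ only.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/"))
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))
```

If `can_fetch` returns False for pages you expect to be indexed, the robots.txt rules are the place to look first.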

    | DarinPirkey
    0