Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Thank you very much, Julie, I really appreciated your words. I have wondered so many times what Google thinks of "quality," and why the sites ranking above us are always very low-quality websites distributing the exact same music for free (often copyrighted music, which is illegal), most of them full of ads. Is that quality? We could open a new discussion thread on the "What is quality to Google?" topic; I think it'd be very popular! Thank you again.

    | fablau
    0

  • "If you use the hreflang, Google won't consider the alternate URLs as duplicate ever ;-)." I wish that was true. That would solve a lot of the .co.uk vs. .com problems that sites have where they want a different URL to show up in the SERPs without actually having to change their English content. But the reality is Google will consider it duplicate content even if you use hreflang. I have blogged about this in detail with examples: https://hreflang.org/hreflang-and-duplicate-content/

    | NickJasuja
    0

  • "_Could it be dupe content from the W_ikipedia pages we imported and indexed?" That's not a good idea. I'd either point to the Wikipedia pages themselves, noindex or canonical them (back to their source). Also, agree with Clayton John. It could be any number and/or combination of factors. Glenn Gabe, president of G-Squared Interactive, watches website traffic trends and fluctuations and comments upon possible root causes. He remarked back in November that "The fall of 2016 has been one of the most volatile ones I have seen in a long time algorithm update-wise" and observed that Google is testing its new mobile-first ranking algorithm. Check out that article to get some other suggestions as to possible root causes.

    | DonnaDuncan
    1

  • This might be a whole new thread, but how does an internal link work if you are linking to a page that is already in the main menu of the page you are linking from? A top-level product page, for example. Is it worth creating anchor text links if they will be the second link discovered by the spiders?

    | bittristo
    1

  • If we are talking about content in a table as a means of representing data, I believe Google won't care too much about how your content is presented; what will matter is how your users react to the layout. As a rule of thumb, always go with what is best for the user. Ask yourself which layout is more visually appealing and gets your point across more easily.

    | thmsmrrtt
    0

  • Hi Marie, thanks for such a detailed answer. I have recently disavowed a domain from which 74 links point to our website, so that domain must be re-crawled after my disavow before we see the impact, right? I want to share our experience related to backlinks. Our global website has visitors around the globe, and our rankings in India and the US have moved inversely as we reclaimed and removed links from a few domains. In more detail: we dropped in India and improved in the US when we reclaimed some backlinks, then dropped in the US and improved in India when we removed those redirects. So it clearly means that even one or a few links can trigger the algorithm enough that our rankings fluctuated by more than 15 positions, and a few (or more) suspicious backlinks might be pushing us down. In this scenario, we can use the disavow tool without hesitation if we know which links those are. We never received any manual action, and I'm not sure about a manual or silent penalty, but we definitely dropped after the recent Penguin update, where some of the backlinks are the real culprits. Hope I shared some useful information. Thanks, Satish
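    For anyone trying the same thing: the disavow file itself is just a plain-text list uploaded through Google Search Console, one entry per line (the domains below are placeholders):

    ```
    # Lines starting with # are comments.
    # Disavow every link from an entire domain:
    domain:spammy-directory-example.com
    # Or disavow a single linking page:
    http://blog-example.com/page-with-bad-link.html
    ```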

    | vtmoz
    0

  • Given this situation, that's exactly what I'd be doing. John's advice does appear to be true; you won't be penalised for duplicate content, but your rankings won't be great either. Theoretically, Google should have no issue determining that site A was the original source of that content, and so it should suffer no ranking change at all. Canonicalisation is the path I'd be taking to hedge bets there and mitigate risk as much as possible. It's still not guaranteed to be safe (Google may choose to ignore canonicalisation), but it's about as close as you're going to get. Hopefully the client doesn't drag their feet and take two years to get that content production started.
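    For reference, the cross-domain canonical on the syndicated copy would look something like this (placeholder URL pointing at site A's original):

    ```html
    <!-- In the <head> of site B's copy of the article -->
    <link rel="canonical" href="https://www.site-a-example.com/original-article/" />
    ```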

    | ChrisAshton
    0

  • I use www.blizzardmetrics.com to analyze bounce rates from a benchmark standpoint. Here is some data from November 2016 for 135 websites:
    - Overall bounce rate: 39.7%
    - Mobile: 43.1%
    - Tablet: 53.7%
    - Desktop: 39.7%
    - Organic search: 38.4%
    - Direct type-in: 59.9%
    - Referrals: 28.7% (something fishy here; it was 54.1% in October and 69.4% in Nov 16)
    - Email: 61.4%
    - Social: 35.3%
    This data is dynamic: if you head over to Blizzardmetrics and add your site, all the numbers will update. If you are an agency and add a bunch of websites, you can look at just your websites, or all websites. You can also categorize by industry.

    | TrentBlizzard
    0

  • Hi Becky! Do these responses help to answer your question or are you looking for more information? If you're good to go, please mark this as answered. Thanks!

    | MeganSingley
    0

  • Thank you, I will try to add links in the PDFs and see if that helps at all.

    | Neverstop123
    1

  • We will definitely employ the 301 redirects once we completely audit the HTTPS site for any absolute links. I just wanted to ensure that we wouldn't have two sites in the index during the transition. Thanks for the feedback.
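    For reference, a site-wide HTTP-to-HTTPS 301 in Apache looks something like this (a sketch assuming mod_rewrite is available; nginx and IIS have equivalents):

    ```apache
    # .htaccess sketch: redirect every HTTP request to HTTPS with a single 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
    ```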

    | AMSI-SEO
    0

  • TL;DR: You're right to be skeptical that this is an urgent issue (in my opinion), but it is something worth fixing at some point, for several reasons. I was far more concerned by search results, but I see you've added those to noindex/disallow in robots.txt, which is great. Not many people know that works! I think it's very possible that Google understands the difference between a classified ad and an editorial content piece; they definitely treat products and content differently. That said, it's generally a good idea to avoid relying on Google's intelligence, as many have been let down by Google's failure to understand. Duplicate content is generally something SEOs are overly concerned with. More often than not it triggers a filter, not a "penalty." I don't see it as the most dangerous thing you could be doing by any stretch of the imagination. That said, I've seen several classified sites do the following, which I'd recommend as a "best practice" approach. At one time Craigslist did this, and may still be doing it:
    1. Accept non-spam ads with a pending status.
    2. Check against listings from a given period of time for duplicates. This happens even if the ad is changed slightly, so there's some kind of semantic-plus-image analysis going on (a minimal version is sketched at the end of this post).
    3. If a duplicate is found under the same user name, inform them that they've already posted the ad. From here the rules are up to you: many sites say the ad can't be posted again for 7 days (if the old ad is deleted) or 30 days (if not), then encourage users to buy a featured listing that shows up higher than others.
    4. If duplicates are found under different user names, warn that posting duplicate ads from multiple accounts is against your terms of service (make sure it is), that accounts can be banned, and have them certify the post is not the same.
    You don't need to follow this exactly; it's here to give you some ideas on having your users prevent duplicate content for you. Given the generally positive architecture I've seen on the site, it looks like you know what to do with it better than I would. I don't think 250 out of 10k is bad; having consulted with a few local classified sites, that's actually quite low. But I do think there's something to be gained by detecting duplicates to prevent users from gaining an unfair advantage over those playing by the rules. And if you sell featured listings, this is an excellent way to help those who are most desperate to sell while increasing revenue. I hope that helps. Obligatory disclaimer: this is merely free advice for your consideration, not the official Moz stance. The consequences of any changes you do or don't make are ultimately your responsibility.
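    Coming back to step 2: the duplicate check can start as something as simple as comparing a normalized fingerprint of each new ad against recent listings. A minimal sketch (the schema and names are hypothetical, not from any particular platform):

    ```typescript
    // Minimal sketch with a hypothetical schema: flag a new ad as a duplicate
    // when its normalized fingerprint matches any listing posted within the
    // last `windowDays` days.
    import { createHash } from "crypto";

    interface Ad {
      body: string;
      user: string;
      postedAt: Date;
    }

    // Lower-case and strip punctuation before hashing, so trivially edited
    // re-posts still produce the same fingerprint.
    function fingerprint(text: string): string {
      const normalized = text.toLowerCase().replace(/\W+/g, " ").trim();
      return createHash("sha256").update(normalized).digest("hex");
    }

    function findDuplicate(newAd: Ad, recentAds: Ad[], windowDays = 30): Ad | null {
      const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
      const fp = fingerprint(newAd.body);
      return (
        recentAds.find(
          (ad) => ad.postedAt.getTime() >= cutoff && fingerprint(ad.body) === fp
        ) ?? null
      ); // caller decides: same user -> "already posted"; different user -> TOS warning
    }
    ```

    An exact normalized hash only catches trivial edits; a real system would layer on fuzzier matching (shingling, image hashing) to catch the slightly changed ads described above.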

    | Carson-Ward
    1

  • I think I'd have to request these. I know it's something I need to look at, but I'm not sure how high a priority I should put on it. Do you think it would make a huge difference if they were stopped from being crawled?

    | BeckyKey
    0

  • Hello! Glad to hear the project should soon be progressing! I haven't worked with automated translation in a while, so I'm not sure how good an automated solution would be nowadays, but a native-speaker editor at the least is probably necessary (and perhaps optimal if the jargon is particularly unusual). A story for an example: my father is a swimming coach and was teaching some Arabic speakers, so he tried to translate the training session (full of swimming jargon) into Arabic. Everything went surprisingly well, besides the translation of "distance per stroke." The tool didn't know that "stroke" in swimming means the same thing as "stride" in running, and the translation read "distance per heart attack."

    | JaneCopland
    0

  • Hi there, I hate to say this, because it's not a great answer, but it really does depend on a bunch of things. Generating links naturally, i.e. links that come from someone linking to you without you asking, is generally quite difficult unless you have a very, very good piece of content. But when this does happen, I've seen content pieces get anything from 10 to 10,000 links, with the majority of them being natural. On the other hand, I've worked on content pieces where I've had to work really hard to get 10 links using outreach. Sometimes you work really hard and get nothing! If you're just starting out on this path, you're unlikely to get links naturally from day one and will need to work hard to get links via outreach. But you should find that over time it becomes easier, and eventually you get to a point where you're not having to scrap for every single link. It isn't easy, but it's certainly possible. I hope that helps a bit! Paddy

    | Paddy_Moogan
    1

  • "My theory is that the uplift generated by the internal linking is subsequently mitigated by other algorithmic factors relating to content quality or site performance or..." I think your initial analysis of the situation is right. Look to improve user-experience, conversion rates and interactions on those pages and try your experiment again. I don't like using bounce rate as a metric for this for several reasons, but if you use Time On Site, Pages Per Visit, or track interactions, such as when they scroll past 50% of the page or click a button... There are plenty of ways to gauge whether your changes are providing a better experience for visitors from search results, which in turn should be roughly the same thing that pleases the algorithm.

    | Everett
    1

  • Yeah, unfortunately, that's core Knowledge Graph (not a Featured Snippet), so it's coming from a partnered data source. I'm not sure where they're pulling this from, though, as the Wikidata/Wikipedia entries seem correct: https://en.wikipedia.org/wiki/Free_Shipping_Day There's not even a "report" option on this one, so I'm afraid Alick300 is right: you're going to have to contact Google directly. I'll ask around on Twitter, but Google isn't transparent at all about these data sources.

    | Dr-Pete
    0

  • Glad I was able to help. Keep me posted on your progress, especially with the homepage.com/keyword-here/ page.

    | SurgeStream
    0