Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi there, It's not the nicest task in the world, but unfortunately it's best to manually review websites before you add them to a disavow file. You can use metrics such as Moz Spam Score to filter them and start with the worst scores, which may speed up the process a bit. You could also prioritise multiple URLs that are from the same domain. For example, if you have 50 URLs linking to you from one domain which looks suspect, you probably only need to review a few of those URLs in order to understand the quality/type of link they are and take action. I'd avoid putting a hard rule in place such as disavowing anything below a certain metric score; you may accidentally disavow links which are perfectly fine. There is also some debate as to the impact the disavow tool has unless you have a manual or algorithmic penalty in place. If you don't see any evidence of a penalty, my advice would be to only disavow links which are clearly low quality or spammy. Hope that helps! Paddy

    Technical SEO Issues | | Paddy_Moogan
    0
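The triage workflow described above (filter by spam score, then sample a few URLs per domain) can be sketched in a short script. This is a minimal illustration, not a Moz API integration: the `(url, spam_score)` pair format is an assumed export shape from whatever backlink tool you use.

```python
from collections import defaultdict
from urllib.parse import urlparse

def triage_backlinks(links, sample_size=3):
    """Group (url, spam_score) pairs by domain, rank domains by their
    worst spam score, and return a few sample URLs per domain so you
    only need to review a handful of links from each suspect site."""
    by_domain = defaultdict(list)
    for url, score in links:
        by_domain[urlparse(url).netloc].append((score, url))
    # Worst (highest) spam score first; a few URLs are enough to judge a domain.
    ranked = sorted(by_domain.items(),
                    key=lambda kv: max(s for s, _ in kv[1]),
                    reverse=True)
    return [(domain,
             max(s for s, _ in pairs),
             [u for _, u in sorted(pairs, reverse=True)[:sample_size]])
            for domain, pairs in ranked]

# Hypothetical export: two links from one suspect domain, one clean link.
links = [("http://spam.example/a", 90),
         ("http://spam.example/b", 80),
         ("http://clean.example/x", 5)]
review_queue = triage_backlinks(links)
# review_queue[0] is the domain with the worst score, reviewed first.
```

The point of sampling per domain is exactly the one made above: once a few URLs from a domain confirm it is spammy, a `domain:` disavow entry covers the rest without reviewing all 50 links individually.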

  • This is exactly the right answer! Also remember that unless the content at the redirect origin and destination URLs is similar, Google may decide not to transfer the SEO authority across. So if you looked at the last active iteration of the old URL which 'earned' the SEO authority (and links), and you compared its content (via something like a Boolean string similarity tool) to the new URL, and the read-out wasn't good, you might lose a little (or all) of your SEO authority. The best thing to do is actually get the backlinks amended, if you possibly can, as this circumvents the whole problem. That being said, it can be time-consuming to do link amends, and what you actually get back can be iffy (some webmasters can even get annoyed if not approached correctly)

    Intermediate & Advanced SEO | | effectdigital
    0
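The similarity comparison mentioned above can be roughed out with Python's standard library. `difflib.SequenceMatcher` gives a 0–1 ratio between the old page's text and the redirect target's text; the example strings are invented, and any threshold you pick for "good enough" is a judgment call, not a documented Google cutoff.

```python
from difflib import SequenceMatcher

def content_similarity(old_text, new_text):
    """Return a 0..1 similarity ratio between two page texts."""
    return SequenceMatcher(None, old_text, new_text).ratio()

# Hypothetical old-page vs redirect-target copy.
old = "Guide to chainsaw maintenance and blade sharpening."
new = "Guide to chainsaw maintenance, blade sharpening and oiling."
score = content_similarity(old, new)
# A high ratio is reassuring; a low one suggests the redirect target
# differs enough that equity transfer could be at risk.
```

In practice you would strip boilerplate (navigation, footers) before comparing, so the ratio reflects the main content rather than the template.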

  • That is just not the case. I saw the same from Google, but disavowing whole domains helps in a big way.

    White Hat / Black Hat SEO | | samdland
    0

  • .... each of your pages/articles would still be focussing on a single keyword (or keyword cluster) while bearing in mind the overall goals. You are 100% correct.  They are all about the same long tail keyword with even longer long tail keywords as variants. You wouldn't have two pages competing for "Stihl Chainsaw MS170 Maintenance" for example, but you might have multiple pages that are talking about "Stihl Chainsaw MS170" in various ways, all probably linking to the page where the customer could buy the actual product or parts, etc. You are right.  And, those pages linking to one another will support the attack on "Stihl Chainsaw MS170"... and all of them plus all of the pages for other models will support the attack on "Stihl Chainsaws".

    On-Page / Site Optimization | | EGOL
    2

  • Firstly, I would definitely take the opportunity to switch to SSL. A migration to SSL shouldn't be something to worry about if you set up your redirects properly, and given that most of your pages aren't indexed at all, it is even less risky. You will eventually get the traffic back; as for how long that takes, it's very difficult to say. I would concentrate on crawlability: make sure your structure makes sense and that you aren't linking to any 404s or worse. Given the size of your site, that wouldn't be a bad thing anyway. From your description of your pages, I'm not sure there is any "importance hierarchy", so my suggestion may not help, but you could make use of Google's API to submit pages for crawling. Unfortunately, you can only submit in batches of 100 and you are limited to 200 a day. You could, of course, prioritise or cherry-pick some important pages and "hub" pages, if such things exist within your site, and then start working through those. Following the recent Google blunder where they deindexed huge swathes of the web and, in the short term, the only way to get pages back in the index was to resubmit them, someone has provided a tool to interact with the API, which you can find here: https://github.com/steve-journey-further/google-indexing-api-bulk

    On-Page / Site Optimization | | Xiano
    1
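The quota maths above (batches of 100, 200 URLs per day) can be planned with a small helper. This sketch only schedules the batches; the actual submission would go through Google's Indexing API `urlNotifications:publish` endpoint, which is omitted here because it requires service-account credentials. The limits are taken from the answer above.

```python
def plan_submission_batches(urls, batch_size=100, daily_quota=200):
    """Split a URL list into per-day lists of batches, respecting the
    batch size and daily quota described above. Returns a list of days,
    each day being a list of URL batches."""
    days = []
    for start in range(0, len(urls), daily_quota):
        day_urls = urls[start:start + daily_quota]
        batches = [day_urls[i:i + batch_size]
                   for i in range(0, len(day_urls), batch_size)]
        days.append(batches)
    return days

# Hypothetical backlog of 450 pages: 3 days of submissions.
plan = plan_submission_batches([f"https://example.com/p{i}" for i in range(450)])
```

Pairing this with a prioritised URL list (hub pages first, as suggested above) means your most important pages are resubmitted on day one.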

  • I've attached some data for the bounce rate and time spent. I segmented by new users, as existing users I'm sure would skew the stats. The Search page wins out on both. As for the freshness of content, the Search page wins again, just by the nature of the content, with new members signing up frequently. I don't really trust GA's page speed metrics; from my tests the two appear comparable, with a slight edge again towards the Search page. I suppose if users are visiting the homepage and then realise they cannot get to the homepage, it could contribute to the bounce rate (and then they may click on the Search result listing). Alternatively though, if Google sees users typically going to the Search page immediately, is it more likely to rank that higher to cater to that experience?

    Intermediate & Advanced SEO | | andrew_uba
    0

  • Why are these links not being clicked? Are they on low-quality websites that nobody reads? Or are they about topics that nobody is searching for? Or is the content quality so low that people quit reading? Each of these has a solution: target better websites, write what people really care about, increase the level of quality. Now, if you are going to put all of that effort into it, then perhaps the website that you should be targeting is your own, filling it full of high-quality articles on topics that people really care about and are searching for. Common wisdom in the SEO community is to spend time building trivial content in the form of guest posts that will produce links that nobody clicks on. I don't get it. I'd rather spend my time building a real website that I can be proud of. Bet on yourself.

    Intermediate & Advanced SEO | | EGOL
    0

  • Thanks Alex; this is really helpful insight.  Lots to think about!  Thank you again - I sincerely appreciate it!

    Intermediate & Advanced SEO | | katelynroberts
    0

  • Hey guys, It does give an error: **Moz was unable to crawl your site on May 3, 2019.** Our crawler was banned by a page on your site, either through your robots.txt, the X-Robots-Tag HTTP header, or the meta robots tag. Update these tags to allow your page and the rest of your site to be crawled. If this error is found on any page on your site, it prevents our crawler (and some search engines) from crawling the rest of your site. Typically errors like this should be investigated and fixed by the site webmaster. Read our troubleshooting guide. With that in mind, given the robots directives currently in place, it does not appear that the crawl should be blocked.

    Getting Started | | DrainKing
    0
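A robots.txt block like the one reported above can be checked locally with Python's standard `urllib.robotparser`, asking whether Moz's crawler (user agent `rogerbot`) may fetch a given page. The robots.txt content below is purely illustrative; in practice you would fetch the live file, and you would still need to check the X-Robots-Tag header and meta robots tag separately, since this parser only covers robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; fetch the real file in practice.
robots_txt = """
User-agent: rogerbot
Disallow: /private/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

allowed_home = rp.can_fetch("rogerbot", "https://example.com/")
blocked_page = rp.can_fetch("rogerbot", "https://example.com/private/page")
```

If `can_fetch` returns True for the pages Moz reports as blocked, the culprit is likely one of the other two mechanisms (header or meta tag) rather than robots.txt.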

  • If you no longer carry that specific product, you can redirect the dead URL to a collection related to that product on your website. If there is no related collection, then redirect it to your homepage. Read Shopify's instructions on how to do a redirect in their platform here: https://help.shopify.com/en/manual/migrating-to-shopify/considerations

    Web Design | | Nozzle
    0

  • In response to your second question, it's fine to have /usa/, although /us/ or /en/ would be a more typical deployment (lots of people go with /en-us/ and /en-gb/, as that structure allows for really granular international deployment!) As long as the hreflangs are accurate and tell Google what language and region the URLs are for, and as long as they are deployed symmetrically with no conflicts or missing parts, it should be OK. Note that Google will expect to see different content on different regional URLs, sometimes even if they're the same language but targeted at different countries (tailor your content to your audience; don't just cut and paste sites, change tags, and expect extra footprint). Stuff like shipping info and prices (currency shown) should also be different (otherwise don't even bother!) Your hreflangs, if you are doing the USA as your EN country, should not use 'en-gb' in the hreflang (instead they should use 'en-us'). If you're thinking the HTML implementation will make the code bloated and messy, read this: https://support.google.com/webmasters/answer/189077?hl=en There are also HTTP header and XML sitemap deployment options (though IMO, HTML is always best and is the hardest, strongest signal)

    On-Page / Site Optimization | | effectdigital
    0
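The "deployed symmetrically with no conflicts or missing parts" requirement above is checkable with a small script: every URL a page references via hreflang should link back to it (return tags). This is a minimal sketch assuming you have already crawled the annotations into a simple `{url: {lang_code: target_url}}` map; the example URLs are invented.

```python
def find_missing_return_tags(hreflang_map):
    """hreflang_map: {url: {lang_code: target_url}}.
    Return (source, target) pairs where the target page does not
    link back to the source - i.e. broken return tags."""
    broken = []
    for url, tags in hreflang_map.items():
        for lang, target in tags.items():
            back = hreflang_map.get(target, {})
            if url not in back.values():
                broken.append((url, target))
    return broken

# Illustrative symmetric pair: en-us and en-gb versions of one page,
# each self-referencing and pointing at the other.
pages = {
    "https://example.com/us/": {"en-us": "https://example.com/us/",
                                "en-gb": "https://example.com/gb/"},
    "https://example.com/gb/": {"en-gb": "https://example.com/gb/",
                                "en-us": "https://example.com/us/"},
}
```

Running the check over a full crawl export surfaces exactly the asymmetries (missing return tags) that cause Google to ignore hreflang clusters.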

  • Thank you for your suggestion! The thing is, last year Google blocked all advertisers who ran cryptocurrency/ICO-related ads (us included), so we have no opportunity to run PPC ads for our brand name. Luckily, no competitors run 'bad' ads targeting our brand name either. I guess it's really the second option that's realistic for us - to push the first query-space to look cleaner. Again, thanks for your helpful ideas!

    Branding / Brand Awareness | | MariY
    0

  • A page which sends another URL PageRank does lose some of its own PageRank in the process. If you have 8 pages under a homepage and they all receive PageRank, and you then add more pages which are also linked to from the homepage, all of the URLs will get a relatively equal amount (but since more URLs are being linked to, each linked page gains slightly less PageRank than before!) If you don't want those pages to rank on Google or be passed PageRank (they are purely landing pages for referral or ad traffic), then you can orphan them if you want. That shouldn't be a huge deal and would keep things isolated. Those new pages, once orphaned, won't rank very well though (in Google's normal, organic results)

    Search Engine Trends | | effectdigital
    0
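The dilution effect described above can be shown with a toy calculation. In the classic PageRank model, a page splits its transferable score evenly across its outbound links, so adding links shrinks each link's share; this is a deliberate simplification of the real (iterative) algorithm, using the conventional 0.85 damping factor.

```python
def share_per_link(page_rank, outbound_links, damping=0.85):
    """Toy PageRank: the score a page passes to each linked URL is its
    own rank times the damping factor, split evenly across links."""
    return (page_rank * damping) / outbound_links

before = share_per_link(1.0, 8)   # homepage linking to 8 pages
after = share_per_link(1.0, 12)   # same homepage after adding 4 more
# Each of the original 8 pages now receives a smaller share than before.
```

This is exactly the answer's point: the homepage doesn't lose its rank outright, but each child page's slice of it shrinks as the link count grows.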

  • Hi there, Sam from Moz's Help Team here! Could you pop a message over to help@moz.com about this so we can ensure everything is running smoothly for you? Thank you!

    Moz Tools | | samantha.chapman
    0

  • Thank you for the responses! Because the actual content on the page isn't seen as duplicate (or anything close using boolean string similarity tools) I think I am just going to differentiate the metadata and try to focus on improving the content on the organic pages. Thanks for your help!

    Intermediate & Advanced SEO | | Wavelength_International
    0