Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Link Explorer

Cover all things links and the industry-leading link data discoverable in Link Explorer.


  • Hello Jacob, Thanks for reaching out, Jo from the Moz support team here. I've investigated your example, and it looks like Accessories?category=beach-towels has 15 duplicates with canonical tags pointing to different URLs. For canonical tags to consolidate duplicates, they all need to point to the same URL.

    I exported the CSV from your Moz Pro Site Crawl and located your example URL: https://www.havaianasaustralia.com.au/Accessories?category=beach-towels From there I scrolled over to the Duplicate URL column, grabbed those URLs, and checked each page's source for the rel=canonical tag. Here are a couple of examples of what I found:

    This page https://www.havaianasaustralia.com.au/Accessories?category=keyrings has this tag: <link rel="canonical" href="http://www.havaianasaustralia.com.au/Accessories"/>

    This page https://www.havaianasaustralia.com.au/Accessories?category=towels has this tag: <link rel="canonical" href="http://www.havaianasaustralia.com.au/towels-havaianas-accessories"/>

    So Accessories?category=beach-towels is picked up as having duplicates because those 15 pages share 90% of the same code. That comparison covers all the source code on the page, not just the viewable text. You can also run your own check using this tool: http://smallseotools.com/similar-page-checker/

    I hope this helps. If there is anything else I can help you with, please do let me know. Cheers! Jo
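    You can check canonical tags yourself programmatically. Here is a minimal sketch using Python's standard-library HTML parser; the example.com URLs are placeholders for illustration, not the site's real tags:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag it sees."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def find_canonical(html):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Two pages whose canonical tags point at DIFFERENT targets will not
# consolidate with each other, which is the situation described above.
page_a = '<head><link rel="canonical" href="https://example.com/accessories"/></head>'
page_b = '<head><link rel="canonical" href="https://example.com/towels"/></head>'
print(find_canonical(page_a) == find_canonical(page_b))  # → False
```

    Running this over each URL in the Duplicate URL column of the crawl CSV would surface any group of duplicates whose canonicals disagree.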

    | jocameron
    0

  • Hi there! Jo here from the Moz support team. Spam Score is made up of 17 unique signals identified for the subdomain content.yudu.com, rather than for that individual page. I wouldn't really describe the Spam Score flags as an algorithm; they're flags that are triggered as data is gathered for our Mozscape index. You can check out all the flags, and the ones triggered for that subdomain, here: https://moz.com/researchtools/ose/spam-analysis/flags?subdomain=content.yudu.com In addition to the Whiteboard Friday that Alick300 shared above, I also recommend reviewing this article by Rand on Spam Score: https://moz.com/blog/spam-score-mozs-new-metric-to-measure-penalization-risk I hope this helps. If there is anything else I can help you with, please do let me know. Cheers! Jo

    | jocameron
    0

  • Hi Kristina, Thanks for the reply, but my question was either poorly worded or misunderstood. I've edited it. I'm not looking for an exportable list of all the backlinks to my site and where they come from. I'm looking for an exportable list of all the pages on my site, with their respective Page Authority (PA). If there was an additional field of data (such as how many backlinks are directed at those pages), that would be a bonus, but not required.

    | micromano
    0

  • Thanks Tawny, It has been well over a month, and I am not new to using Moz; I'm simply trying to figure out how long the latency is. I will read up on the API. Thank you for the information.

    | seanallen007
    0

  • Hey there! It's less an issue of a bug in our system and more a factor of how the index was formed this time around. We expect data to normalize and stabilize after our next index update.

    | tawnycase
    0

  • Hey there! Tawny from Moz's Help Team here! The only way to make bulk checks in Open Site Explorer is to use our Mozscape API. You can read more about the API in our Help Hub, here: https://moz.com/help/guides/moz-api/mozscape/overview It's pretty technical, so you'd likely need a web developer to help you decipher the results you get back. I hope this helps! If you've got questions about the API or how to use it, feel free to send us a note at help@moz.com and we'll sort things out for ya!
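    To give a flavour of what "pretty technical" means here: Mozscape requests are authenticated with signed query parameters. This is a minimal sketch of the HMAC-SHA1 signing scheme the Mozscape docs described at the time; the credentials and timestamp below are made up for illustration, so check the current API documentation before relying on it:

```python
import base64
import hashlib
import hmac

def mozscape_signed_params(access_id, secret_key, expires):
    r"""Build the AccessID/Expires/Signature query parameters for a
    Mozscape request. Per the documented scheme, the signature is the
    base64 of an HMAC-SHA1 over access_id + "\n" + expires, keyed with
    your secret key. `expires` is a Unix timestamp in the near future."""
    to_sign = "{}\n{}".format(access_id, expires).encode()
    digest = hmac.new(secret_key.encode(), to_sign, hashlib.sha1).digest()
    return {
        "AccessID": access_id,
        "Expires": str(expires),
        "Signature": base64.b64encode(digest).decode(),
    }

# Hypothetical credentials, for illustration only.
params = mozscape_signed_params("mozscape-demo", "s3cret", 1700000000)
```

    The resulting parameters get appended to the metrics endpoint URL; a developer would then loop over a list of URLs to do the bulk checks.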

    | tawnycase
    0

  • Hi Sam, First of all, congrats on the lovely site. DON'T PANIC! OSE does not represent live results, as it is linked to the Mozscape index, which updates about once a month (there was an update on Jan 26th and the next one is due around Feb 28th). Discussed here: https://moz.com/help/guides/research-tools/open-site-explorer/updates For the latest and pending update information, go here: https://moz.com/products/api/updates

    If you want to do a quick check to see whether your site is showing for both www and non-www variants, do a search for **site:sassandgrace.co.uk** (with no spaces) and you'll see what Google has indexed. Interestingly, there is one reference within Google's results for the www. version, **www.sassandgrace.co.uk/blog-detail.html**, which is a remnant from development as it contains lorem ipsum text. If the redirects have been created manually, it's easy to see why this would be overlooked.

    Best practice would be to create a global redirect rule from the www to the non-www version of your site. Read this for some solid best practices when dealing with redirects: https://moz.com/learn/seo/redirection You can also indicate that the non-www version is your preferred version of the site in Google Search Console.

    That stray page aside, I'd have no concerns. Check back in OSE after the next update and see if things have settled down. Good Luck!

    | Hurf
    0

  • Hey there! Tawny from Moz's Help Team here. Sorry to hear about your DA dropping after our index update! That's a bummer! Rand does a really good job of breaking down what went differently with this month's index and why it affected DA scores the way it did in this thread: https://moz.com/community/q/is-everybody-seeing-da-pa-drops-after-last-moz-api-update#reply_359883 Take a gander and see if that helps answer why your scores have dropped. If you've still got questions, feel free to shoot us a note at help@moz.com and we'll do our best to answer them all.

    | tawnycase
    0

  • Hurf - Thanks for looking into this. Yes, I'm running Integrity for Mac, which is a link scanner; that's where I got the broken-link image that I attached to this ticket.

    | NCCompLawyer
    0

  • Hi Ria, thanks for your reply. Yes, I am pruning out the bad pages; however, I am looking at many factors, not just PA. So it seems I am on the right track and will continue. Gemma

    | acsilver
    0

  • Start by getting things in order in Google Search Console. You should add both the non-www and www properties for your site. When both are verified you can set your preference (in this case, non-www). Otherwise, "If you don't specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages" and you risk issues down the line with duplicate content. See here for details about setting your preferred domain: https://support.google.com/webmasters/answer/44231

    If, as I read your follow-up post, you're saying all of your links point to the www version of your site, then this could be the source of your problems. You need to organise your domains within Google Search Console, then determine where all of the inbound links and authority are headed and set your preferences within GSC accordingly. You can use 301 redirects when things have settled down, but don't rush and make mistakes as a result.

    You can work through the GSC documentation here, then follow the appropriate action for your scenario: https://support.google.com/webmasters/answer/34592?hl=en I hope that helps, and good luck!
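    Whichever host you standardise on, a global 301 rule should map every URL on the other host to its exact counterpart, preserving path and query string. A minimal Python sketch of the mapping such a rule implements (example.com is a placeholder domain):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_non_www(url):
    """Return the non-www form of a URL, keeping path, query and
    fragment intact. A global 301 redirect rule should send every
    www URL to exactly this target (no collapsing to the homepage)."""
    parts = urlsplit(url)
    host = parts.netloc
    if host.lower().startswith("www."):
        host = host[4:]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_non_www("https://www.example.com/blog?p=2"))
# → https://example.com/blog?p=2
```

    The important detail is the one-to-one mapping: redirecting every www URL to the non-www homepage, rather than to the matching page, throws away the link equity of deep pages.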

    | Hurf
    0

  • Hey there John! Sam from Moz's Help Team here! The URLs that appear in Top Pages are there because we've found the most external links to them. Although the pages are no longer on your site, those links (using the previous, now-redirected URLs) still appear on other sites. Top Pages is simply counting up the links we found to your site, to show the most frequently linked URLs. I hope this helps to clarify! You can always check the status code for a given URL by plugging it into a third-party HTTP status checker like https://www.hurl.it/. Let me know if there's anything else I can help with!
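    When auditing a list of old URLs, it helps to group the status codes you get back, so you can confirm the old pages really do 301 to their new homes. A minimal sketch of the standard HTTP groupings (this is generic HTTP semantics, not Moz-specific behaviour):

```python
def describe_status(code):
    """Rough grouping of HTTP status codes for a redirect audit."""
    if 200 <= code < 300:
        return "OK"
    if code in (301, 308):
        return "permanent redirect"   # link equity should follow these
    if code in (302, 303, 307):
        return "temporary redirect"
    if 400 <= code < 500:
        return "client error"         # e.g. 404 for a page that's gone
    if 500 <= code < 600:
        return "server error"
    return "other"

# In practice you'd fetch the code with urllib, e.g.:
#   from urllib.request import Request, urlopen
#   code = urlopen(Request(url, method="HEAD")).status
print(describe_status(301))  # → permanent redirect
```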

    | samantha.chapman
    0

  • Thanks Gaston, this makes so much sense! Appreciate your added insight!

    | karrabarron
    0

  • A search for the www subdomain will likely pull up very similar results to searching for the non-www root domain, but it's still possible that the results will be different, depending on how other sites are linking back to yours. You can always use the filters to see links pointing at the root domain level!

    | tawnycase
    0

  • Hey there Matthew! Sam from Moz's Help Team here - would you be able to pop a message over to help@moz.com, with the URL and the name of the campaign where you're seeing this occur, so we can take a deeper look into this for you?

    | samantha.chapman
    0

  • Hi Rob, That's a big sway in favour of shifting to the subfolder, and I'll absolutely take you up on your offer via PM. Thanks very much!

    | Warren_33
    0

  • Hey there! Sam from Moz's Help Team here - thanks for writing in with a great question! This is definitely due to the way we collect this information for our link index, and it is actually expected that we won't find every page and link, because we aren't looking for them all! When we collect this data, we're looking specifically for the most valuable links. Rather than crawling your entire site, or every site, we start our crawler on a few of the highest-ranking sites and let it perform a breadth-first search to see what it finds. For each page that we crawl, we first collect each of its links before following them and collecting the details of each page they link to, and so on. There's a set limit of links that we'll crawl per page, and of pages that we'll crawl per site, so it's expected that we may not follow every link on a site this way. A few points on exactly how we compile our index:

    • We grab the most recent index.
    • We take the top 10 billion URLs with the highest MozRank (with a fixed limit on some of the larger domains).
    • We start crawling from the top down until we've crawled ~130 billion URLs.

    The idea here is that we're focusing on the highest-quality links we can find, coming from the most prominent pages of authoritative sites. So, while you may not see every link for a site within our index, we're aiming to report the most valuable ones available. Most new sites and links will be indexed by our spiders and available in Mozscape and Open Site Explorer within 60 days, but some take even longer for many reasons, including the crawlability of sites, the number of inbound links to them, and the depth of pages in subdirectories. Since Moz focuses on quality of links over quantity, we may leave some of the lower-quality (non-link-juice-providing) links out of our index. So that might explain why you see some discrepancies with what other tools show. If you're looking for a tool to show all of your backlinks, we might not be the best fit for you, but we do hope to show you all of the most valuable ones. I hope this helps - let me know if you have any further questions!
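    The seed-and-expand process described above amounts to a breadth-first search over the link graph with per-page and total budgets. A minimal Python sketch, using a dict as a stand-in for fetching pages; the graph and the limits are illustrative, not Moz's real parameters:

```python
from collections import deque

def breadth_first_crawl(seeds, links, per_page_limit=3, total_limit=10):
    """Breadth-first crawl over a link graph: start from seed pages,
    follow each page's outgoing links (capped per page), and stop once
    a global page budget is spent. `links` maps a URL to the URLs it
    links to, standing in for actually fetching and parsing pages."""
    seen, queue, crawled = set(seeds), deque(seeds), []
    while queue and len(crawled) < total_limit:
        page = queue.popleft()
        crawled.append(page)
        for target in links.get(page, [])[:per_page_limit]:
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return crawled

graph = {"a": ["b", "c", "d", "e"], "b": ["f"]}
print(breadth_first_crawl(["a"], graph))  # → ['a', 'b', 'c', 'd', 'f']
```

    Note that "e" is never crawled: it sits beyond the per-page link cap on "a", which is exactly how a capped crawl can miss real links that other tools report.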

    | samantha.chapman
    0

  • Thanks for the response. It does make sense, but I will also send the site in question just to make sure you can get the best idea of what I mean. Thanks

    | TMI
    0