Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Moz Pro

Discuss the Moz Pro tools with other users.


  • What about the data that comes from Moz reports, like error counts and data points that are proprietary to Moz?  Do you guys just dump that data?

    | kwahlquist
    0

  • Sure thing.  Presuming your main site's blog is in WordPress, there's this handy-dandy importer: https://wordpress.org/plugins/blogger-importer/ There are instructions in the Installation section on how to export your existing Blogspot posts into an XML format that the importer can then read.

    | MichaelC-15022
    0

  • As I said on your other post, it's most likely Magento's weirdness. Make sure you understand how that CMS works. Also, check where those pages are being linked from, because that's usually how crawlers find them. Once you've found where a page is linked from, either delete or fix the link. Cheers!

    | seomozinator
    0

  • Hey Erick, So it's tricky for me to answer your question definitively, but it's worth highlighting that both Moz metrics and Majestic metrics are simply there as a guide to relative page strength. Weak pages on strong domains frequently outrank stronger pages on weaker domains, if that makes sense. Additionally, Google takes many factors into account when ranking pages - not just links. As I mentioned above, from a user's perspective that page is a 'good result' for reviews; hence Google tends to rank pages like that well. As such, I suspect you may struggle to outrank the page. If I were you, I wouldn't reach out to the blogger to ask them to delete the link. The danger is that (as you said) you start a whole new firestorm and attract a bunch more links to the ROR page. I suspect it's not that link that's making the difference in any case, so even if you did get it removed, it's likely that the page would still rank. Cheers, Hannah

    | Hannah_Smith
    0

  • Definitely agree here, there isn't a "patch" or "quick fix" for this one. Get mobile friendly and focus your efforts on doing it correctly and in a timely manner and you should be able to bounce back over time.

    | Fuel
    0

  • It looks like your last line is just redirecting one page? This has worked for me to redirect them all to their corresponding pages on the new domain (*You'll want to test this first):
    <code>RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP_HOST} !newdomain.com$ [NC]
    RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]</code>
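    Before deploying, the rule's logic can be sanity-checked offline. Here is a rough Python sketch of what the condition and rule do (a hypothetical helper for illustration, not part of the .htaccess itself):

    ```python
    import re

    def rewrite(host, path):
        """Mimic the rules above: requests on any host that does not end in
        newdomain.com are 301-redirected to the same path on newdomain.com."""
        # RewriteCond %{HTTP_HOST} !newdomain.com$ [NC]
        if re.search(r"newdomain\.com$", host, re.IGNORECASE):
            return None  # condition fails -> no redirect (avoids a loop)
        # RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]
        return "http://newdomain.com/" + path.lstrip("/")

    print(rewrite("olddomain.com", "/products/widgets.html"))
    # -> http://newdomain.com/products/widgets.html
    print(rewrite("www.newdomain.com", "/products/widgets.html"))
    # -> None (already on the new domain, so no redirect loop)
    ```

    Note the `!` in the RewriteCond: the redirect only fires for hosts that do *not* already match the new domain, which is what prevents an infinite loop.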

    | Everett
    0

  • Hi Sean -- Can you clarify for me how competitors in a campaign figure in to the 50,000 page limit?  Does the main page in the campaign get thoroughly crawled first and then competitors are crawled up to the limit? Some examples: If the main site is 100 pages, and I pick 2 competitors that are 100 to 1000 pages and a 3rd gargantuan competitor of 300,000 pages, what happens?  Does it matter in what order I enter competitors in this situation as to whether the 100-page and 1000-page competitors get crawled vs. whether the limit maxes out on the 300K competitor before crawling the smaller competitors? If the main site is 300,000 pages, do any competitors in the campaign just not get crawled at all because the 50,000 limit gets all used up on the  main site? What if the main site is 20,000 pages and a competitor is 45,000 pages?  Thorough crawl of main site and then partial crawl of competitor? I feel like I have a direction to go in based on our previous discussion for the main site in the campaign, but now I'm still a little stumped and confused about how competitors operate within the crawl limit.

    | scienceisrad
    0

  • Hi, I can't recommend ScreamingFrog enough! The free version has a limit of 500 pages, so you will need a license. With such a large site it is recommended to check on the crawl progress to ensure there are no crawl 'loops'. If, for example, it turns out that a calendar on a page counts as a link for every day, then the crawl will likely never finish! Luckily you can fix such things by creating a URL Exclude rule and then re-running the spider. Kind regards, Jimmy
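    The calendar-loop fix boils down to a regex exclude rule. A small sketch of the idea (the URLs and pattern here are made up for illustration):

    ```python
    import re

    # Hypothetical URLs illustrating a calendar "crawl loop": one new link per day.
    urls = [
        "https://example.com/products/widget",
        "https://example.com/events/calendar?date=2015-06-01",
        "https://example.com/events/calendar?date=2015-06-02",
        "https://example.com/contact",
    ]

    # A Screaming Frog Exclude rule is a regex matched against the full URL.
    exclude = re.compile(r".*/events/calendar\?date=.*")

    to_crawl = [u for u in urls if not exclude.match(u)]
    print(to_crawl)
    # -> ['https://example.com/products/widget', 'https://example.com/contact']
    ```

    Once the endless date URLs are filtered out, the spider can finish the remaining (finite) set of pages.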

    | DSM_UK
    0

  • Very simple. Your old site http://www.instantissuance.com uses a 301 redirect, which can be seen here: http://www.internetofficer.com/seo-tool/redirect-check/ It then redirects to http://www.datacard.com/instant-issuance-solutions, which is the exact page you mentioned above. In tools like Majestic and Moz you can see that the most common term used in backlinks to the site, apart from the domain name, is "dynamic card solution". When you perform a 301 redirect, all the positive signals that a site has are passed to its new home. This is a clear indication of that happening. I hope that makes sense!

    | gazzerman1
    0

  • Hi Kate - really appreciate you putting my mind at ease! 107 sounds about right, although I wonder if it's picking up some of the 'thin' category pages for example that WP creates. I really need to learn WP!

    | newstd100
    0

  • I would most definitely write it out as "inch" and "inches", and even "in." when necessary. A simple Google search for intitle:48" will show you that Google ignores the inch mark on this search. Another search of intitle:" shows no results. Any search I can find using "inches" does not return a result with " in the title. A simple search for most keywords with synonyms or alternate versions shows the synonym or alt version in at least some pages. Inches does not. (Dinner shows dinner, dining & diner. Autos shows results for cars. Inches does not show ", and a search for 10" shows results for 10, not "10 inches".) So yes, I would use the full words in this case.

    | MattAntonino
    0

  • Hey Joris, I'm afraid that we aren't able to see what parameters you are adding to GWT, since we don't use that information in any way to crawl your site. We conduct a proprietary crawl of the site and report on exactly what your source code and server respond with. Patrick is right that adding the canonical tags to your site (or using the robots.txt file to block the campaign pages) would clear up those errors in our crawler. Since some dynamic URL generators can cause problems for crawlers, we do try to be overly-inclusive of these issues. We want people to know about potential issues with sites, even if they're not really issues in the scheme of the site owner's specific SEO implementation plan. In sum, we'd rather leave those judgments up to you. I hope this helps explain our thinking here!
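    The two fixes mentioned above (canonical tags or robots.txt blocking) look roughly like this; the URL pattern and domain are placeholders for your actual campaign parameters:

    ```
    # robots.txt -- keep crawlers out of tracking-parameter URLs
    User-agent: *
    Disallow: /*?utm_

    <!-- or, in the <head> of each campaign-parameter page, point at the clean URL -->
    <link rel="canonical" href="http://www.example.com/page/" />
    ```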

    | ChiarynMiranda
    0

  • Hello. Thank you for your detailed reply. To answer your points: I don't think the extension of the domain will make any difference at all to our customers. Yes, a .com might be better from an international point of view, but this is not a major point at the moment. I understand that no extension will make a difference. The point for me, though, is that we were ranking fairly well with the .com, but the .co.uk has never performed as well. The last SEO company did say this was due to Google's updates, but I heard that excuse nearly every time something didn't work out. Really, is there any way to tell if changing back to the .com would be of benefit, or does Google pass all the link juice over with a 301 anyway? Paul

    | UKHost4u
    0

  • Hey there Moz actually has a great step by step guide to help you with deciding what content can be updated, removed, or consolidated. You can read more about that here. It covers everything from traffic, relevancy, shares, linking and categorization and more. Super comprehensive. Hope this helps! Good luck!

    | PatrickDelehanty
    0

  • Hi Paul! Our tool has a 90% tolerance for duplicate content, which means it will flag any content that has 90% of the same code between pages. This includes all the source code on the page and not just the viewable text. You can run your own tests using this tool: http://www.webconfs.com/similar-page-checker.php. In the case of http://www.ukhost4u.co.uk/blog and http://www.ukhost4u.co.uk/blog/author/trafficsource, these pages are 98% similar, which is why they're being flagged. We have a great article about best practices for fixing Duplicate Content here: https://moz.com/learn/seo/duplicate-content Thanks! Kevin Help Team
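    The similarity check described above compares the full page source, not just the visible text. A rough approximation using Python's standard library (this will not reproduce the exact percentage Moz's tool reports; the template and pages are invented for illustration):

    ```python
    from difflib import SequenceMatcher

    def similarity_pct(page_a, page_b):
        """Rough percentage of matching characters between two full page sources."""
        return SequenceMatcher(None, page_a, page_b).ratio() * 100

    # Two pages built from the same template; only the title differs.
    template = ("<html><head><title>{t}</title></head><body>"
                "<nav>Home Blog About Contact</nav>"
                "<p>Latest posts are listed here with excerpts and dates.</p>"
                "<footer>Copyright 2015 Example Hosting Ltd.</footer>"
                "</body></html>")
    page_a = template.format(t="Blog")
    page_b = template.format(t="Author: trafficsource")

    pct = similarity_pct(page_a, page_b)
    print("flagged" if pct >= 90 else "ok")
    # -> flagged
    ```

    Because the shared template (navigation, footer, markup) dominates the source code, two pages with different titles still come out well above the 90% threshold, which is exactly why thin author/archive pages get flagged.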

    | kevin.loesken
    1

  • To add to Patrick's great response, this page describes what you are seeing. When you try to go to a page on your site that does not exist, you seem to correctly get an error page. But the content of the page does not necessarily reflect the HTTP response returned by the server. Your server is saying that the page was found, although the content of the page says it wasn't found. [A 404 response from the server simply means Not Found.] If you look in your Google Webmaster Tools under Crawl > Crawl Errors you will see a tab called Soft 404. A soft 404 is a page that says it is not found but is not returning an actual 404, not found, response. [I always wondered how Google could tell--I was given the answer that if Google sees that a whole bunch of different URLs bring up the same page, which does not return a 404 response, it assumes that that page says it is an error page and labels it a soft 404.] If you want to test whether a URL on your site is correctly returning a 404, you can use Fetch as Google to see if it returns Not Found or else there are a number of free server response checkers online.
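    The hard-404 vs. soft-404 distinction described above can be sketched as a simple classifier (the error phrases here are illustrative; Google's actual detection also compares many URLs against each other, as noted in the answer):

    ```python
    def classify_response(status_code, body):
        """Classify a server response for a missing page.

        A hard 404 returns HTTP status 404. A 'soft 404' returns 200 OK
        while the page content claims the page was not found.
        """
        looks_like_error = "not found" in body.lower()
        if status_code == 404:
            return "hard 404"
        if status_code == 200 and looks_like_error:
            return "soft 404"
        return "ok"

    print(classify_response(404, "<h1>Not Found</h1>"))        # -> hard 404
    print(classify_response(200, "<h1>Page not found!</h1>"))  # -> soft 404
    print(classify_response(200, "<h1>Welcome home</h1>"))     # -> ok
    ```

    The takeaway is that the status line and the page content are independent: fixing a soft 404 means making the server actually send a 404 (or 410) status for those URLs, regardless of what the error page says.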

    | Linda-Vassily
    0

  • Thanks so much for all of this wonderful info! I am actively working on all of these items! I'll let you know.

    | NutcrackerBalletGifts
    0

  • Awesome, awesome answer here, man. Thanks for taking the time to respond. I went in and set up WMT for the old sites and things are looking a little bit better now.

    | BrianJGomez
    0

  • The first two responses are typical of what I've seen on this cult--er, community. Thanks to Keri for answering the question so I can cancel and get back to ahrefs.

    | ecommerc
    0

  • Thanks Joe - I'll check the settings right now for NextGen and see what I find. I appreciate it. Travis

    | 2Spurs
    0