Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Getting Started

Get up and running with the Moz tools.


  • It appears that Zenfolio really doesn't want Moz crawling their site for some reason. They want me to jump through hoops to get it to work and, as an ultra-small business, I can't justify the effort to get it working. Guess I could always migrate to another web host. Thanks for the replies.

    | bpenn11
    1

  • In the main Rankings section of your Campaign, the related URL is automatically populated with the highest-ranking URL for that particular keyword. This URL can't be updated because it's pulled directly from the SERP at the time of data collection. That said, if you'd like to see how a particular page is ranking for that keyword, there are a few ways to find that info. The first is the Analyze a Keyword section, where you can select "Keyword Performance" from the drop-down to see the top 5 ranking URLs for that keyword on your site. The next is Page Optimization, where you can pair a keyword with a URL from your tracked site and we'll show you how well optimized that page is for that keyword, along with its rank. The rank then updates weekly and can be seen in Page Optimization's "Track and Monitor" tab. I've got a few guides to help get you started with these tools: https://moz.com/help/moz-pro/page-optimization/overview and https://moz.com/help/moz-pro/rankings/analyze-keywords

    | dave.kudera
    0

  • This is true, but interestingly you can still pull MozTrust and MozRank for URLs using the Mozscape API. To do that you need an active Moz subscription and a tool like URL Profiler. I'm not saying it's a good idea to utilise deprecated metrics - in fact, quite the opposite, and (as per Eli's response) I'd steer clear. However, the question asked how to fetch these metrics, and there are still ways to do so. It's just inadvisable.
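    For the curious, a signed request to the Mozscape v1 URL-metrics endpoint can be built by hand without any third-party tool. The sketch below is a rough illustration, assuming the legacy v1 signed-auth scheme (HMAC-SHA1 over access ID plus expiry); the exact `Cols` bit values for each metric, and whether deprecated columns are still returned, should be checked against Moz's API documentation.

    ```python
    import base64
    import hashlib
    import hmac
    import time
    from urllib.parse import quote, urlencode

    def mozscape_url_metrics_request(target_url: str, access_id: str,
                                     secret_key: str, cols: int,
                                     expires_in: int = 300) -> str:
        """Build a signed Mozscape v1 URL-metrics request URL.

        cols is a bitmask selecting which metrics to return; the
        per-metric bit values are listed in Moz's Cols reference.
        """
        expires = int(time.time()) + expires_in
        # Legacy v1 signature: base64(HMAC-SHA1(secret, "accessID\nexpires"))
        string_to_sign = f"{access_id}\n{expires}"
        signature = base64.b64encode(
            hmac.new(secret_key.encode(), string_to_sign.encode(),
                     hashlib.sha1).digest()
        ).decode()
        params = urlencode({
            "Cols": cols,
            "AccessID": access_id,
            "Expires": expires,
            "Signature": signature,
        })
        return (f"https://lsapi.seomoz.com/linkscape/url-metrics/"
                f"{quote(target_url, safe='')}?{params}")
    ```

    You'd then GET the returned URL with any HTTP client; the response is a JSON object keyed by the metric columns you requested.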

    | effectdigital
    1

  • Hi there! To check the Spam Score, you'll just want to head to Link Explorer and plug in the URL there, then navigate to the Spam Score tab on the left. https://analytics.moz.com/pro/link-explorer/home You can learn more about Spam Score here: https://moz.com/help/link-explorer/link-building/spam-score Let us know if we can help with anything else!

    | samantha.chapman
    0

  • Hi there Liz! You can check on this by heading to your Rankings tab within your campaign, and in the dropdown that appears, clicking on the 'SERP Features' tab. You can read more about this here: https://moz.com/help/moz-pro/rankings/serp-features Let us know if we can help with anything else!

    | samantha.chapman
    0

  • Thanks for your responses, Maureen. From what I know, when you alter your site to be 'faster', you sometimes have to wait a few days for that to start reflecting in page-loading speeds. I am pretty sure that, if you have server-side caching enabled and resources were previously cached uncompressed, the old resources can continue being served to people for days (or even weeks) after alterations are made. This is certainly true of image compression (where the old JPG / PNG files continue to be served after being replaced with more highly compressed versions, since the cache has not refreshed yet) - I am unsure whether that applies to Gzip-compressed files or not (sorry!).
From what I understand, page-speed optimisation is not a straightforward, linear process. Many changes you could make benefit 'returning' visitors whilst making the site slower for first-time visitors (and the reverse is also true; there are changes which push you in both directions). Because of these competing concerns, it's often tricky to get the best of both. For example, one common recommendation is to take all your inline (or in-source) CSS and JS and place it in '.css' or '.js' files which are linked to by your web pages. Because most pages will call in the separated-out CSS or JS files as a kind of external common module (library), once a user has cached the CSS or JS, it doesn't have to be loaded again. This benefits returning site users. On the flip side, because external files have to be pulled in and referenced on the first load (and because they often contain more CSS / JS than is needed), first-time users take a hit.
As you can see, these are tricky waters to navigate, and Google still doesn't make it clear whether they prefer faster speeds for returning or first-time users. In my experience, their bias leans more towards satisfying first-time users. Some changes that you make, like compressing image files (and making them smaller), benefit both groups; just be wary of recommendations which push one user group's experience at the expense of another.
For image compression, I'd recommend downloading all your images via FTP (to preserve the folder structure) and running them through something like Kraken. I tend to use the 'lossy' compression algorithm, which is still relatively lossless in terms of quality (I can't tell the difference, anyway). Quite often developers will tell me that a 'great' WordPress plugin has been installed to compress images effectively. In almost all cases, Kraken does a 25%-50% better job. This is because WP plugins are designed to run on a server which is also hosting the main site and serving web traffic; as such, these plugins are coded not to use too much processing power (and they fail to achieve a good level of compression). I'm afraid there's still no substitute for a purpose-built tool and some FTP file-swapping :') Remember, though: even when the images are replaced, the cache will have to cycle before you'll see gains... Hope that helps
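    To make the server-side pieces above concrete, here is a hypothetical sketch of what enabling Gzip compression and long-lived caching might look like on an Apache host (assuming mod_deflate and mod_expires are available; directives and lifetimes vary by host and are illustrative only):

    ```apache
    # Gzip-compress text-based resources on the fly (mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
    </IfModule>

    # Long-lived browser caching for static assets (mod_expires)
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png  "access plus 1 month"
        ExpiresByType text/css   "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>
    ```

    Note the trade-off described above: the longer the expiry, the longer previously cached (e.g. uncompressed or unoptimised) copies may keep being served to returning visitors.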

    | effectdigital
    0

  • Hi Georg! Not a problem! Moz's Spam Score is the percentage of sites with similar features that we've found to be penalized or banned by Google (it's not based on the spam scores of the sites linking to you). To improve this score, I'd recommend reading our guide, which explains the 27 factors that make up the score (we don't identify which ones specifically are affecting your site within the UI). You can then look at your site and investigate the areas you'd like to improve: https://moz.com/help/link-explorer/link-building/spam-score

    | samantha.chapman
    1

  • Hi there, Sam from Moz's Help Team here! Sorry for the delay! Keyword Explorer metrics can take up to 24 hours to build in a list, although usually it doesn't take nearly that long. If those metrics are still not showing tomorrow, could you please pop an email over to help@moz.com - a screenshot of the list would be awesome, and we can dig into this further. If you have any other questions, please feel free to ask away as well!

    | samantha.chapman
    0

  • Ed had a great answer. Make sure you have compelling, original content that is optimized for a potential query. Try to make it so valuable that people will want to link to it.  Focus on good internal linking to these pages and make sure that your site is responsive and fast. If you can accomplish this, you will see increases in PA/DA. Good luck!

    | KevinBudzynski
    0

  • Hey there! Looking at your attachment, it seems that those sites are linking to nonexistent subdomains on your site. If you evaluate these links and decide that they are harming your site, you can disavow them using Google's disavow tool. As a general rule, though, just seeing some weird links doesn't mean you need to disavow them. You can read more about disavowing links here: https://moz.com/blog/do-we-still-need-to-disavow-penguin I hope that helps!
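    For reference, the file you upload to Google's disavow tool is plain text: one URL or `domain:` rule per line, with `#` lines treated as comments. The domains below are placeholders, not real recommendations:

    ```
    # Spammy pages linking to nonexistent subdomains on our site
    domain:spammy-example.com
    http://another-example.net/bad-links-page.html
    ```

    A `domain:` rule disavows every link from that domain, while a bare URL disavows links from that single page only.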

    | moz_support
    1

  • Thank you, I responded via email. Since you have improved the data so much, does this mean that going forward you can look at legacy data, or will it continue to be limited to 12 months? I am having to track my own data so that I do not lose it. Chris

    | cptutty
    0

  • Replying to question.

    | SergioMejicanos
    1

  • Hi Meghan, thank you so much for your help. I will try this step by step and keep in contact. Best regards

    | ceciliaosio
    1

  • Hey there, thanks for reaching out to us! Entering a keyword nationally versus with a specific location will produce different rankings. It's really up to you how you want to track those keywords. If you are unsure, you can also track a single keyword both nationally and locally; it will just count as two keywords towards your limits. Hope that helps, let me know if you need anything else.

    | dave.kudera
    0

  • Hey, Thanks for reaching out to us! You can create a Campaign solely for a subdomain or subfolder by selecting the +Advanced setting in the Campaign set-up; just click the check box there and it will limit our Campaign audit to the pages on that specific subdomain or subfolder. From there, you can check in your Campaign Settings whether you've set it up for just that chunk of your site or for the entire root domain. I've got a guide to this process that I think may help. With regard to your second question, feel free to reach out to help@moz.com so that we can take a closer look. Looking forward to hearing from you, Eli

    | eli.myers
    0

  • Hi there! Thanks so much for the great question! I'm so sorry you're having trouble getting your site crawled. Without knowing the exact site you're working with, I can't say for sure what's going on, but I can offer some suggestions that may help. If you're seeing the crawl succeed with other tools, there may be a bot-specific setting or directive that is banning our crawler, rogerbot, from your site. This may be coming directly from your server, or it may be listed in your robots.txt file. I have a troubleshooting guide here that may help: it outlines some other common issues that can keep rogerbot from being able to move forward with the crawl. If you're still having trouble, please feel free to send an email on over to help@moz.com with the site you're having trouble with. That way we can take a look and see what's going on!

    | meghanpahinui
    0

  • Hi Leandro! Thanks so much for the great question! In addition to everything that Roman pointed out, I wanted to be sure to point you in the direction of some resources we have about Domain Authority, how to use it, and how to improve it. You can find information about this here. This resource includes some really useful videos as well.

    | meghanpahinui
    1

  • Yoast could be your problem here - do you use it? Version 7.2 turned it on its head and redirected media attachments to pages of their own, creating thin content. This is a larger problem than just Moz; it affects your Google index. Yoast released a plugin that eventually deindexes those pages. Check their site for more information on how to fix the problem.

    | Libra_Photographic
    0

  • I run the training program and can share some testimonials from our recent students below. The current coursework has widely positive feedback, with many students attending more than one class. The one area of feedback we get about improvements (and are addressing) is that our most advanced attendees tend to think the courses are too basic. That makes sense, as we have been aiming mostly at the beginner-to-intermediate user and have one-off seminars focused on the most advanced user groups. The most popular courses are the Keyword Research, SEO Fundamentals, and Site Audit classes. These really focus on practical aspects of delivering SEO. We try to make that our differentiation. Having taken a lot of online classes ourselves, we don't find value in theory-only coursework. Without a practical application, it's hard to justify the investment. So we focus on processes, application of concepts, and workflows. Here are some testimonials we've gathered recently: "It's definitely worth the price of admission" "I would highly recommend MOZ training. Covers all the questions you are afraid to ask. Gives an informative insight into SEO and helps you discover how to read and relate to what's in front of you. How it impacts your business and, most importantly, how to put it into practice. Bravo! Will definitely sign up for more. Thanks" "I really enjoyed this training. The content was very well organized and essential for learning the basics of SEO. The course was very informative and educational. I highly recommend this course."

    | BrianChilds
    0

  • Grettelp, You can sometimes have more success getting ideas for local keyword phrases using the free Google Keyword Planner tool. It allows you to specify a geography you're interested in - city, municipality, county, township, state, province, country. Google Keyword Planner does not give specific volumes (they give rounded figures for groups of related keywords), but you'll at least get an idea of what people are searching for and can test it. You can also see a sampling of the phrases people are using to find your content by looking at search queries in Google Search Console, also free.

    | DonnaDuncan
    1