Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Moz Tools

Chat with the community about the Moz tools.

Subcategories

  • Get up and running with the Moz tools.

    561 Questions
    2k Posts
    elonmmusk

    You'll need to build quality backlinks to increase your DA/PA in Moz; that means quality links from high-authority sites. I recently increased the DA for my international movers business site by building high-authority, quality links.

  • Discuss the Moz Pro tools with other users.

    823 Questions
    4k Posts
    bilaljkdfgsaui

    I am also facing the same issue on my website. If you found any solution, please let me know. Thanks!

  • Chat keyword research strategy and how Keyword Explorer helps you do your best work.

    8 Questions
    23 Posts
    fuadahmadi928

    Maybe the site owner is blocking access from Moz.

  • Cover all things links and the industry-leading link data discoverable in Link Explorer.

    679 Questions
    3k Posts
    samantha.chapman

    Hello! Sam from Moz's Help Team here! After being found, newly discovered links can be populated into our index in about 3 days. However, there are a lot of factors which can affect our ability to find and index links to your site. It's important to note that we are always adding new data to our index, but it may take some time for us to discover backlinks to your site, based on factors like the crawlability of the referring pages, the quality of the links and the referring pages, and more.

    If you are not seeing links that you know you have, you may want to make sure that they can be indexed. It is also a good idea to check whether we've indexed the page on which the link is found; if we haven't indexed the referring page yet, you won't see your link in our index. You can also add links to Link Tracking Lists. Once you add a link to your tracking lists, we will queue that page to be crawled; as long as it is accessible to our crawler, you should see the link in our index as soon as we can index those pages.

    Lastly, I have a great guide with some things to check around why we may not have found your links yet: https://moz.com/help/link-explorer/link-building/moz-isnt-finding-your-links If you'd like any further information, please feel free to pop us an email over at help@moz.com. We also have a great guide to Domain Authority here: https://moz.com/learn/seo/domain-authority

  • Find insights and conversations specific to the Research Tools within Moz Pro.

    989 Questions
    4k Posts
    aseu

    Can I add this to my website tenchoicez.com for bulk checking?

  • Discuss the Moz Local tool with other users.

    316 Questions
    1k Posts
    eli.myers

    Hey there! Thanks for reaching out to us! I'm sorry to hear about this. Would you be able to reach out to help@moz.com so we can take a closer look, please? Looking forward to hearing from you.

  • Discuss link data, metrics, and all of the calls available through the Links API.

    223 Questions
    1k Posts
    adamsmith47

    Hi, no, Moz does not have any option to disavow links, and you should not worry about disavowing links in Moz. Instead, disavow them from Google Search Console, because Google is the search engine that ranks your site based on your link profile.

  • Find expert assistance to help you troubleshoot technical issues with the Moz tools.

    529 Questions
    2k Posts
    HussainAwan

    It's interesting. Can you please leave a screenshot or link so we can investigate the solution? For reference, check my keyword; it is showing in a featured snippet: Legal Translation Dubai.

  • Let us know about features and functionality that you’d like to see in the Moz tools.

    159 Questions
    625 Posts
    eli.myers

    Hi, great question! Link Explorer and the Links tab of Moz Pro Campaigns are both tied to our link index, which is constantly updating. After being found, newly discovered links can be populated into our index in about 3 days. When discovered or lost links are found, we'll update our database to reflect those changes in your scores and link counts. We prioritize the links we crawl based on a machine learning algorithm designed to mimic Google's index. This does not mean that DA and PA will change with every data update, though; they will only change if we find new link data for a respective site. I'm sorry I can't tell you exactly when your DA will update; it depends on when we find new equity-passing backlinks to your site.

    You can read more about our new Link Explorer tool and our index here. You can also read more about how our link index compares with our competitors' here: https://backlinko.com/best-backlink-checker Feel free to reach out to help@moz.com with any further questions.

  • Have a question that doesn’t quite fit in another category? Drop us a line here.

    418 Questions
    2k Posts
    hafixali1234

    google drawing Toto 4d result drawing

  • Learn about news around the Mozplex and projects that Mozzers are working on.

    230 Questions
    2k Posts
    BartonInteractive

    Hi snjaoieiw, To get a detailed answer from Moz staff on what DA is, you might consider searching the Q&A forum. In short, though, it is a Moz metric (not a Google or Bing metric) that takes into consideration the number (and quality) of backlinks your website has. That said, have you been working on building up high quality backlinks? -Zack


  • No problem! Yes, the same is true for HTTP and HTTPS.

    | LoganRay
    0

  • Good point Eric! But the number AND quality of links still matter, even if it isn't called PageRank. Just clarifying that link authority and relevancy are being passed from one site to another...

    | bbkahlich
    0

  • As I understand it, you get referral traffic from websites like traffic-cash and those with the .xyz extension. There is a great article on how to filter those: https://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/

    | solvid
    0

  • You do have an empty meta description tag in the head of each page, but it is followed by a meta description tag with content. Perhaps the crawler is getting caught up on the first one and reporting that?

    | Brando16
    0
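
The pattern Brando describes, an empty meta description immediately followed by a populated one, looks like this in HTML. This is illustrative markup only; the real page's tags and content are unknown:

```html
<head>
  <!-- An empty meta description like this one... -->
  <meta name="description" content="">
  <!-- ...followed by the real one. A crawler that reads only the
       first matching tag would report the description as missing. -->
  <meta name="description" content="Example page description, invented for illustration.">
</head>
```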

  • The Local SEO test is at https://moz.com/blog/local-search-expert-quiz-2016. I would use it as a jumping-off point for making your own test (and randomizing the questions), rather than simply having potential candidates take the Moz one.

    | Christy-Correll
    1

  • Hi Sarah, Somehow I answered this and I must have forgotten to post the answer! Argh, it was a long one, too. Let me try to summarize what I'd do:

    - If possible, noindex any page that doesn't display content while not logged in. Wait for those pages to drop out of the index, and monitor for errors.
    - If not possible, skip straight to blocking pages behind a login wall with robots.txt. For example, to block anything in the login folder:
      Disallow: /login
      Or to block anything with a login variable:
      Disallow: /*?login

    This should prevent bots from crawling those URLs where you don't have any content to show them. Make sure to use this carefully. I do apologize for the delay. If you have additional questions, please feel free to PM me. I'd be happy to do a quick consult online or over the phone, as I feel bad that I never actually answered, and I can give you more specific ideas if we look at the site. If this answers your question, that's fine too. Good luck!

    | Carson-Ward
    0
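
Note that Carson's `Disallow` lines need to sit inside a `User-agent` group to form a valid robots.txt file. A minimal sketch of the file he describes (the paths are his examples; the rest is standard robots.txt syntax):

```
User-agent: *
# Block anything in the login folder
Disallow: /login
# Block any URL carrying a login query parameter
Disallow: /*?login
```

The `*` wildcard inside a path, as in the second rule, is a de facto extension honored by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt standard, so behavior can vary for smaller bots.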

  • I know it's not always the answer people want to hear, but Matt's right: this is basically where we're at. OSE tends to focus on higher-authority links and quality over quantity. Unfortunately, while this works well for tracking the strengths in your link profile, it doesn't always do as well at tracking the weaknesses. We're very much interested in expanding the quantity as well, but it's a balancing act and, in the interest of full transparency, there are many engineering challenges. People have compared our index to Majestic and Ahrefs in the blogosphere. Since I can't claim to be unbiased, I'd welcome you to read those posts and make your own judgments. In fairness to Majestic and Ahrefs, all three of us are at least somewhat transparent about our sources and general methodologies. Unfortunately, Google is not very transparent about how they sample links or choose which data to show, so directly comparing any of the major SEO tools to Google Search Console proves a lot trickier. We're also not clear on Google's update cycle for that data.

    | Dr-Pete
    0

  • Looking at domain authority is like looking at the size of the dog in the fight instead of the size of the fight in the dog. If you can beat their content for that keyword and know how to promote it then you should attack without fear.

    | EGOL
    0

  • Hi Simon, WordPress isn't great at handling slideshows and paginated pages for SEO. It would be best to have a developer look into this and make sure that canonical URLs are added to the pages to address the duplicate content. As far as I know, there are no out-of-the-box solutions for this.

    | Martijn_Scheijbeler
    0
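
Martijn's canonical suggestion amounts to a single tag in the head of each duplicate slideshow or paginated page, pointing search engines at the version you want indexed. A minimal sketch; the URL is hypothetical:

```html
<!-- Placed in the <head> of each slideshow/pagination variant -->
<link rel="canonical" href="http://example.com/slideshow/">
```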

  • Hey Richard! It appears your site might be intermittently placing these titles on the page. Looking at cached data, you can find the titles we picked up last week:
    view-source:http://webcache.googleusercontent.com/search?q=cache%3Aelectroustic.co.uk%2Fconnectivity-products%2Findustrial-ethernet%2Faccessories&oq=cache%3Aelectroustic.co.uk%2Fconnectivity-products%2Findustrial-ethernet%2Faccessories&aqs=chrome..69i57j69i58.1179j0j4&sourceid=chrome&es_sm=91&ie=UTF-8
    view-source:http://webcache.googleusercontent.com/search?q=cache%3Ahttp%3A%2F%2Felectroustic.co.uk%2Fconnectivity-products%2Findustrial-ethernet%2Fwlan-industrial-wireless-lan&oq=cache%3Ahttp%3A%2F%2Felectroustic.co.uk%2Fconnectivity-products%2Findustrial-ethernet%2Fwlan-industrial-wireless-lan&aqs=chrome..69i57j69i58.1337j0j1&sourceid=chrome&es_sm=91&ie=UTF-8
    Hope this helps!

    | DavidLee
    0

  • Here is an update on what is happening so far. Please excuse the length of this message. The database, according to the host, is fine (please see below), but WordPress is still calling https:

    * In the WP database wp-actions, http is definitely being called
    * All certificates are OK and SSL is not active
    * The WordPress database is returning properly
    * The WP database mechanics are OK
    * The WP config file is not doing https returns; it is calling http correctly

    They said the only other possibility could be one of the plugins causing the problem. But how can a plugin cause https problems? I can see 50 different https pages indexed in Google. Bing has been checked and there are no https pages indexed there. All internal URLs have always been http only, and that is still the case.

    I have Google-fetched the website pages, and of the 50 https pages most are images, which I think probably came from the Yoast sitemap originally submitted to the search engines. More recently I have taken all media image URLs out of the Yoast sitemap and put noindex, follow on all image attachment files (the pages and the images on those pages will still be crawled and indexed; it just means the image URLs won't). What will happen to those unwanted https files, though? If I place rel canonical links on the pages that matter, will the https pages drop out of the index eventually? I just wish I could find what is causing it (analogy: better to fix the hole in the roof than to keep using a bowl to catch the water each time it rains).

    I looked at Analytics today and saw something really interesting (see attached image): there are 5 instances of the trailing slash at the home page, and to my knowledge there should only be 1 for a website. The Moz crawl shows just 1 home domain, http://example.co.uk/, so I am somewhat confused. Google search results showed 256 results for https URL references, and there were 50 available to click on. So perhaps there are 50 https pages being referenced for each trailing slash (could there be 4 other trailing-slash duplicate pages indexed, and how would I fix that if it is the case?).

    This might sound naive, but I don't have the skill set to fix this at this time, so any help and advice would be appreciated. Would the Search and Replace plugin help at all, or would it be a waste of time since the WordPress database mechanics seem to be OK? I can't place https-to-http 301 redirects for the 50 https URLs that are indexed in Google, and I can't add https rewrite rules in htaccess, since that type of redirect will only work if an SSL is active. I already tried several redirect rules in htaccess and, as expected, they wouldn't work, which again probably means that SSL is not active for the site. When https is entered instead of http, there should be an automatic resolve to http, but I tried again and the https version appears with a red diagonal line through it. The problem is that once a web visitor lands on that page they stay in https (visually, the main nav bar contents stretch across the page and the images and videos don't appear), so the traffic drops off: a bad experience for the user, dropped traffic, decreasing income, and bad for SEO (split page juice, decreased rankings).

    There are no crawl errors in Google Search Console, and Analytics shows Google Fetch completed for all pages, but when I request fetch and render for the home page it shows as partial instead of completed. I don't want to request any https URL removals through Google and the search engines; it's not recommended, because Google states that the http version could be removed along with the https. I did look at this last week: http://www.screamingfrog.co.uk/5-easy-steps-to-fix-secure-page-https-duplicate-content/

    Do you think the https URLs are indexed because links pointing to the site use https? Perhaps most of the backlinks are https, but the preferred setting in Webmaster Tools / Search Console is already set to the non-www version instead of the www version; there has never been an https version of the site. This was one possibility re duplicate content. Here are two pages and the listed duplicates:

    The first Moz crawl I ever requested came back with hundreds of duplicate errors, and I have resolved this. Google's crawl had not picked this up previously (so I figured everything had been OK), and it was only realised after that Moz crawl. So https links were seen to be indexed, and the goals are to stop the root cause of the problem and to fix the damage so that the https URLs can drop out of the SERPs and the index. I considered that the duplicate links in question might not be true duplicates as such: the duplicate pages (page attachments created by WordPress for each image uploaded to the site) have no real content, so the template elements outweighed the unique content elements, which was flagging them as duplicates in the Moz tool. So I thought these were unlikely to hurt, as they were not duplicates as such, but they were indexed thin content. I did a content audit and tidied things up as much as I could (blank pages and weak ones), hence the recent new sitemap submission and fetch to Google. I have already redirected all attachments to the parent page in Yoast, removed all attachments from the Yoast sitemap, and set all media content (in Yoast) to 'noindex, follow'. Naturally, it's really important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any backlinking work yet, and any links I have posted in search land have all been the http version.

    As I understand it, most server configurations should redirect by default to http when https isn't configured, so I am confused as to where to take this, especially as the host has given the WP database the all clear. It could be taxonomies related to the theme, or a slider plugin, as I have learned these past few weeks. Disallowing and deindexing those unwanted https URLs would be amazing, since I have so far spent weeks trying to get to the bottom of the problem. Ideally, I understand from previous weeks that these two things would be very important: (1) 301 redirects from https to http (the host in this case cannot enable this directly through their servers, and I can only add these redirects in the htaccess file if there is an active SSL in place); (2) a canonical URL using http in place for both the http and https variations. Both of those solutions might work on their own, and if the 301 redirect can't work with the host, then the canonical will fix it? I saw that I could just set a canonical with a fixed transport protocol of http:// and Google will then sort out the rest. Not preferred from a crawl perspective, but would it suffice? (Even so, I don't know how to put that in place.)

    There are around 180 W3C validation errors. Would it help matters to get these fixed? Would this help to fix the problem, do you know? The home page renders with critical errors and a couple of warnings. The 907 theme scores well for its concept and functionality, but its SEO reviews aren't that great. The duplicate problems are not related to the W3 Total Cache plugin, which is one of the plugins in place. Regarding add-ons (trailing slash): for example, http://domain.co.uk/events redirects to http://domain.co.uk/events/. The add-on must only do it on active URLs; even if it didn't, there were no reports of trailing-slash duplicate errors in the Moz crawl, so that is a separate issue that would need looking at on its own, I would think. At the bottom of each duplicate page there is an option for noindex.

    There are page sections and parallax sections that make up the home page, and each has to be published to become a live part of the home page. This isn't great for SEO, I understand, because only the top page section is registered in Yoast as the home page; the other sections are not crawled as part of the home page but are instead separate page sections. Is it OK to index those page sections? If I noindex, follow them, would that be good practice here? The theme does not automatically block the page sections from appearing in search engines. Can noindex only be put on whole pages, and not on specific page sections? I just want to make sure that the content on all the pages (media and text) and page sections is crawlable.

    To ultimately fix the https problem with the indexed pages, could this eventually be a case of having to add SSL to the site just because there is no better way, simply so the https-to-http redirect rule can be added to the htaccess file? If so, I don't think that would fix the root cause of the problem, but the root cause could be one of the plugins? Confused. With canonical URLs, does that mean the https links that don't have canonicals will deindex eventually? Are the https links giving a 404? (I'm worried, because normally 404s need 301s, as you know, and I can't put a 301 on an https URL in this situation.) Do I have to set a canonical for every single page on the website because of the extent of the problem? Nearly all of the traffic is being dropped after visiting the home page, and I can't for the life of me see why. Is it because of all these https pages? Once canonicals are in place, how long will it take for everything to return to how it should be? Is it worthwhile starting a PPC campaign, or should I wait until everything has calmed down on the site? Is this a case of setting the canonical URL and then the rest will sort itself out? (Please see the screenshot attached regarding the 5 home pages that each have a trailing slash.)

    This is the entire current situation. I understand this might not be so straightforward, but I would really appreciate help, as the site continues to drop traffic and income. Others will be able to learn from this string of questions and responses too. Thank you for reading this far, and have a nice day. Kind regards,

    | SEOguy1
    0
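
For reference, the https-to-http 301 the poster keeps coming back to is usually written as an htaccess rewrite rule like the one below. As the poster correctly notes, it can only take effect if the server actually terminates SSL for the domain; otherwise the https request never reaches the rewrite engine. A minimal sketch using standard Apache mod_rewrite directives:

```
RewriteEngine On
# If the request arrived over HTTPS...
RewriteCond %{HTTPS} on
# ...301-redirect it to the http:// equivalent of the same path
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```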

  • Sounds like a good idea. I'm just trying to provide basic data on a potential client's site: traffic, rankings, main errors, and things that are working for them. Some tools actually do a pretty good job of this, but only seem to do it for the homepage rather than the entire site. I guess I'll have to use multiple tools and hope the effort is worth it, as it's a free service. Do you use SEO Profiler?

    | CamperConnect14
    0

  • Hi, any update on this thread? I'm facing the same dilemma currently. I need to know exactly how much more powerful a DA of 50 is vs. 30, for example. If there's any way to access the most top-level estimation of the logarithmic scale, it would be greatly appreciated.

    | ggpaul562
    0
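
Moz documents that DA is scored on a logarithmic scale, so the jump from 30 to 50 represents far more than a linear difference, but the exact curve is proprietary and not exposed anywhere. Purely as an illustration of what a logarithmic scale implies (the growth factor `base` below is invented, not anything published by Moz):

```python
def relative_effort(da_low: int, da_high: int, base: float = 1.1) -> float:
    """Illustrative only: if the effort needed to gain one DA point grows
    by a constant factor (the defining property of a log scale), then the
    effort ratio between two scores is base raised to the difference.
    The base of 1.1 is a made-up value for demonstration."""
    return base ** (da_high - da_low)

# With the invented base, DA 50 works out to roughly 6.7x the
# link-building "effort" of DA 30 under this toy model.
print(round(relative_effort(30, 50), 2))
```

The real takeaway, independent of the made-up base, is that each additional DA point costs more than the last, so comparing 50 vs. 30 as "20 points better" understates the gap.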

  • Great, thank you! It was a pressurised situation!

    | SEM_at_Lees
    0

  • Just answered here: https://moz.com/community/q/what-is-the-best-way-to-add-a-noindex-nofollow-meta-tags-to-tags-in-a-blog

    | GastonRiera
    0

  • Hi again Stephen, Have you read the Yoast tutorial? There is a section that details the noindex tags. This is the article: YOAST tutorial - Noindex Tag. There is another way to do it: manually, with robots.txt. Here is a Moz tutorial - Robots.txt. Hope it helps. GR.

    | GastonRiera
    0
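
The noindex tag that the Yoast tutorial covers is a standard robots meta tag placed in the page head; a minimal example (note that robots.txt, the alternative mentioned above, only controls crawling, not indexing):

```html
<head>
  <!-- Ask search engines not to index this page, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```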

  • Alvaro has detailed it well. Also add in Semrush.com and Keyword Planner. Then, if it is a different market (country), I try to speak to a local, as sometimes there are whole new phrases or words that you could never be aware of. On Semrush, if you know the competitors, you can undertake a lot of keyword research by reviewing the keywords your competitors are targeting.

    | ClaytonJ
    0

  • Hi Candice, Would you mind asking this question in a new thread? Thanks so much! Christy

    | Christy-Correll
    0

  • Thanks again Dirk. I like your direct and knowledgeable responses. I have sent a LinkedIn connection! Many thanks, Sarah

    | Mutatio_Digital
    0