Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hey there! Thanks for reaching out to us! It sounds like it could be a server block; would you be able to contact us at help@moz.com? Thanks! Eli

    Other Research Tools | | eli.myers
    0

  • Hey there! Thanks for reaching out to us! The data contained here is the only information we pull in using your GA profile: https://moz.com/help/moz-pro/traffic-from-search/overview Feel free to reach out to help@moz.com with any further questions. Best, Eli

    Technical Support | | eli.myers
    0

  • I have tried to check the keywords for Mediachowk, but I am unable to get all of them; they are not updated in Moz.

    Keyword Explorer | | nkodiou
    0

  • Hey there! Thanks for reaching out to us! We recognise exact matches found within the source code of your page. We struggle with parsing JavaScript, so typically keywords are found within the HTML. Feel free to reach out to help@moz.com with any further questions. Best, Eli

    Other Research Tools | | eli.myers
    0
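
The answer above notes that Moz matches keywords against a page's raw HTML rather than JavaScript-rendered output. A minimal sketch of that kind of check, using a made-up HTML snippet (a real check would fetch the live page source first):

```python
import re

def keyword_in_raw_html(html: str, keyword: str) -> bool:
    """Case-insensitive exact-phrase match against raw HTML source,
    roughly how a crawler that does not execute JavaScript sees a page."""
    return re.search(re.escape(keyword), html, re.IGNORECASE) is not None

# Hypothetical page: the heading is in static HTML, but anything
# JavaScript renders into the empty <div id="app"> is invisible here.
html = "<html><body><h1>Best Running Shoes</h1><div id='app'></div></body></html>"
print(keyword_in_raw_html(html, "running shoes"))  # True
print(keyword_in_raw_html(html, "trail shoes"))    # False
```

If a keyword only appears after client-side rendering, a check like this (and, per the answer, Moz's matcher) will not find it.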

  • There are tools for this, some with notifications and some without, but they are often not 100% accurate. To be accurate, you should check your external links in GSC. Tools often show you things late, or show links Google may not know about. I think Ahrefs, Semrush, and even Moz Link Explorer check links, including new and lost ones. You can include GSC links in Semrush, so the basis is accurate. I'd bet the same is true for some other tools, but I only know it for sure in Semrush. You can get a report from Semrush, so it may fit your needs.

    Link Building | | paints-n-design
    0

  • Hi, thanks for your answer. As I said before, I'm having this problem even when I remove all the structured data from Tag Manager. I'm not sure what is generating that schema. Any ideas? Regards,

    Intermediate & Advanced SEO | | Alexanders
    0

  • Yikes, I just realized something; let me add on to this story. Maybe this is the issue and the cause of the 301 not really taking effect, with the new domain almost seeming to start from scratch. The old domain was, let's say for example, 123 Paper Company, with a focus on the same topic. The old design was also updated when we did the 301. When we switched the domain, the client changed addresses (yet again), changed his business name, changed his website design, and kept the same content. The content has been updated and modified a little, but it's pretty much the same. Since the client made all these changes at once, I feel Google might be negating the 301 benefits because it looks like the domain was sold and redirected to a new firm. What does the community think?

    Technical SEO Issues | | waqid
    0

  • Well, if you do change the URL and set up the 301 redirect, you can always change, and should change, all the internal links to point to the new URL. As far as losing any PR, Google announced back in 2016 that 301 and 302 redirects no longer lose PageRank. You can read about it in Cyrus Shepard's post, https://moz.com/blog/301-redirection-rules-for-seo. He states: "While it’s super awesome that Google is no longer “penalizing” 301 redirects through loss of PageRank, keep in mind that PageRank is only one signal out of hundreds that Google uses to rank pages. Ideally, if you 301 redirect a page to an exact copy of that page, and the only thing that changes is the URL, then in theory you may expect no traffic loss with these new guidelines." It sounds like the only thing that will change in your situation is your URL, so you should be able to move forward with confidence that 301 redirecting /marvel/avengers/hulk to /marvel/hulk won't have any negative effects (at least not long-lasting ones). But if you're still feeling cautious, it's fine to leave this one URL unchanged.

    Intermediate & Advanced SEO | | Nozzle
    0
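
The redirect discussed above can be thought of as a lookup from old paths to new ones. A minimal Python sketch using the hypothetical /marvel/... paths from the answer; in practice the 301 would be configured in your web server or CMS, not in application code like this:

```python
# Map of permanently moved paths to their new locations
# (hypothetical example paths from the answer above).
REDIRECTS = {
    "/marvel/avengers/hulk": "/marvel/hulk",
}

def resolve(path: str):
    """Return (status, location) for a request path: a 301 permanent
    redirect if the path has moved, otherwise a normal 200 at the
    same path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/marvel/avengers/hulk"))  # (301, '/marvel/hulk')
print(resolve("/marvel/hulk"))           # (200, '/marvel/hulk')
```

Updating internal links, as the answer recommends, means fewer requests ever hit the redirect map at all.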

  • Hi there! On the one hand, the site: number of results is neither the exact nor the current count of URLs that Google has indexed; it's just a way of seeing roughly how many results there are. Google has said it's optimized for speed, which is why it's an estimated number. You should trust what's reported in Search Console. On the other hand, it's possible that Google checked a URL and considered it valid at one point but did not index it, because of canonicals, similar content, or some other reason. If you find a URL that is not indexed yet is reported as indexed in the Coverage report, try checking it through the URL Inspection tool, then using the TEST LIVE URL option. There you may find some answers. Hope it helps. Best of luck, Gaston

    Intermediate & Advanced SEO | | GastonRiera
    0

  • When you syndicate content, you give one of your articles to another website. They publish it on a page, and their page begins to rank in the SERPs for the keywords of the article. If their website is more powerful than yours, they are most likely going to rank above you in the SERPs for the root, short-tail, and long-tail keywords. Even if they give you a link in the article, their page is going to rank above your page in the SERPs.

    If you give one of your articles to several other websites and they publish it, the result will be several pages in the SERPs with your article. What happens then is that Google sees all of these identical articles. They don't like that, and they will filter most of those articles and send them to the supplemental index, where they will get almost no traffic.

    Some people will argue, "I published my copy of the article first, and Google will rank me better or will not filter my page." From my experience, those people are wrong much of the time at best and most of the time at worst. I know this for a fact because I have published articles given to me by other people, and my site almost always outranks their site in the SERPs after I publish. I warn them that this will happen and let them know that if I publish their article I will not remove it. After that, they don't offer me any more articles.

    Syndication should be done for one reason: you have a message and you want to get it out everywhere. For that purpose, you can give your article to a lot of other websites and they will display it to their audience. However, if you are trying to monetize a website, syndicating articles will most likely damage your rankings, for these reasons:
    1) Your articles on other websites will outrank you.
    2) Sometimes your article on your own website will be filtered.
    3) If Google sees that you are trying to build links through article syndication, they might discount the value of those links or even use them as part of a penalty.
    4) If you give lots of the articles on your website to other websites, Google might see your site as "having nothing unique" and demote your entire site in the rankings.

    Moz Tools | | EGOL
    0
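
Google's duplicate-content filtering described above is not publicly documented, but a common textbook way to see why verbatim syndicated copies are easy to detect is shingle-based similarity. A rough sketch, illustrative only and not Google's actual method:

```python
def shingles(text: str, k: int = 3) -> set:
    """Return the set of overlapping k-word shingles (word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets: 1.0 means the
    same word sequences, 0.0 means no shared k-gram at all."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original   = "content syndication gives one of your articles to another website"
syndicated = "content syndication gives one of your articles to another website"
rewritten  = "a fresh piece covers the same topic in entirely different words"

print(jaccard(original, syndicated))  # 1.0 -- exact duplicate
print(jaccard(original, rewritten))   # 0.0 -- no shared 3-word shingles
```

An exact syndicated copy scores 1.0, while a genuine rewrite of the same topic can score near 0.0, which is why verbatim syndication is so easy for a search engine to group and filter.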

  • We had a similar situation, and a Moz rep (thanks, Samantha) pointed us to the Help material (very close to what she wrote in the earlier post) about the 90% figure. We realized the reason for the message was that the affected pages had very little content, so the page header and footer sections (WordPress site) dominated the page content. We realized the issue could be ignored in our case.

    Other Research Tools | | 7thPowerInc
    2

  • Very good to know this ... I was concerned that it is exactly as you're describing (quite a cost to remove tags)! Thanks for your response.

    On-Page / Site Optimization | | JakeWarren
    1

  • Hey there! Thanks for reaching out to us! Link Explorer and the Links tab of Moz Pro Campaigns are both tied to our link index, which is constantly updating. Our index updates daily! Moz crawls and indexes billions of pages, adding fresh link data every day. When discovered or lost links are found, we'll update our database to reflect those changes in your scores and link counts. We prioritize the links we crawl based on a machine learning algorithm that mimics Google's index. Each day, DA and PA will be updated to reflect this new data. This does not mean that DA and PA will change every day; they'll only change if we find new link data for a given site. Feel free to write to us at help@moz.com so we can take a closer look at your website. Looking forward to hearing from you, Eli

    Link Explorer | | eli.myers
    0

  • Hi Michael, SEO textbooks and best practices, if there are any, say that whenever multiple pages serve the same content and you acknowledge that only one of them is critical, you should redirect the other pages to the primary one. Thinking like this: redirect /index.asp to the root page. Just to clear up the "there is some extra juice" point: analyzing both pages through Moz's or any other third-party metric could lead you to wrong conclusions. Moz creates its metrics with an algorithm that takes links and some other magic into account; see Moz's documentation on Page Authority for more information. With that in mind, it's perfectly normal that one page gets more links and has a different PA, even though the pages are canonicals of each other. It also reinforces the idea that the page should be redirected. On a side note, Google has publicly said that it doesn't use metrics like Moz's to rank a website or page. Just having a canonical is almost enough to transfer its authority. Hope it helps. Best of luck, Gaston

    Web Design | | GastonRiera
    0
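
The canonical transfer mentioned above depends on the <link rel="canonical"> tag being present and correct in the page's HTML. A minimal sketch of reading that tag with Python's standard-library parser; the page and URL below are made up for illustration:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical":
                self.canonical = a.get("href")

# Hypothetical /index.asp page declaring the root page as canonical.
html = '<html><head><link rel="canonical" href="https://example.com/"></head></html>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/
```

If this came back empty for the duplicate page, a 301 redirect, as the answer recommends, would be the more robust way to consolidate the two URLs.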