Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Scrapebox is a great tool. You can use it for white hat link building during the research phase of link building. It allows you to find hundreds or thousands of potential websites using search engine commands, check whether those sites are indexed in Google, and export the results to a .txt file or Excel spreadsheet. Personally I love using it, because it saves me a ton of time by automating certain aspects of keyword research and outreach target research within specific niches. If you're using it for just automated blog comments, then you're using a tiny fraction of the tool. Yes, you can post blog comments in mass quantities if you want, but that's not the real benefit of a tool like this. Google "How to use Scrapebox" and you'll find a ton of resources on how to really use the tool, like Neil Patel's write-up here - https://www.quicksprout.com/the-advanced-guide-to-link-building-chapter-8/ It's also a lifetime subscription, so you'll always be able to get the most recent updates, which makes it well worth the price.
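
    For anyone curious what that research-phase export can look like outside of Scrapebox itself, here's a rough Python sketch (the file names are hypothetical) that reduces a scraped URL list to unique root domains and writes them to a .txt file for outreach:

      # Rough sketch of the research-phase export described above: take a raw
      # list of scraped URLs, reduce it to unique root domains, and write them
      # to a .txt file for outreach. File names are hypothetical.
      from urllib.parse import urlparse

      def export_unique_domains(in_path: str, out_path: str) -> int:
          domains = set()
          with open(in_path, encoding="utf-8") as f:
              for line in f:
                  url = line.strip()
                  if not url:
                      continue
                  host = urlparse(url).netloc.lower()
                  if host.startswith("www."):
                      host = host[4:]
                  if host:
                      domains.add(host)
          with open(out_path, "w", encoding="utf-8") as f:
              f.write("\n".join(sorted(domains)))
          return len(domains)

      # e.g. export_unique_domains("scraped_urls.txt", "prospects.txt")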

    White Hat / Black Hat SEO | | Eric_Rohrback
    1

  • At this point, other than "it's probably not Penguin," we don't have much insight into what's been going on over the past week, beyond the fact that, as Peter N. said, multiple tools are showing ranking shake-ups. If you're talking about a total loss of top pages, though, I think it may be premature to assume an update was in play. I'd definitely thoroughly check the technical aspects. Are these pages still being indexed? Are they being cached properly? Do they show up for longer-tail or exact-match terms (in quotes) - in other words, have they dropped in ranking, or are they ranking for nothing at all? The more you can pin down, the better. Unfortunately, it's very hard to speak in generalities and tell you what factors were involved in this week's updates. It really takes a deep dive into the site(s) in question.

    Intermediate & Advanced SEO | | Dr-Pete
    0

  • Hi Peasoup, I would certainly advise against simply copying the content to the 4 different domains. I have a few good links for you: https://moz.com/learn/seo/cctlds - more info on ccTLDs. https://moz.com/community/q/duplicate-content-on-multinational-sites - exactly your issue; make sure you read Gianluca Fiorelli's comment. Hreflang: https://support.google.com/webmasters/answer/189077?hl=en Good luck with it!
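
    To make the hreflang part concrete, here's a small sketch (the domains are hypothetical) of the alternate tags each country version of a page would carry, per the Google help page linked above:

      # Minimal sketch of hreflang annotations for ccTLD versions of one page.
      # The domains are hypothetical; every language/country version of the page
      # (including itself) gets one <link rel="alternate" hreflang="..."> tag.
      VERSIONS = {
          "en-gb": "https://www.example.co.uk/widgets/",
          "en-au": "https://www.example.com.au/widgets/",
          "de-de": "https://www.example.de/widgets/",
          "x-default": "https://www.example.com/widgets/",
      }

      def hreflang_tags(versions: dict) -> str:
          """Return the head tags to place on each version of the page."""
          return "\n".join(
              f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
              for lang, url in versions.items()
          )

      print(hreflang_tags(VERSIONS))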

    International Issues | | Tymen
    0

  • If the Penguin update wasn't rolled out in the last few hours, the new "bad" links couldn't have caused a ranking loss. If you have bad links and you know it, disavow them or ask the linking sites to remove them. Disavowing is a bit like pointing Google to something it may not have noticed. If the next Penguin isn't a real-time update, you can't recover until the one after that. So if you do have bad links, you need to get rid of them quickly - Penguin is coming soon. Keep working on link earning through great, shareable, linkable content. That's hard to do, but it's the best thing you can do.
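
    If you do go the disavow route, the file itself is just a plain text upload; here's a minimal Python sketch (the bad domains are hypothetical) of putting one together:

      # Rough sketch of building a disavow file for Google's disavow links tool.
      # The bad domains/URLs are hypothetical; one "domain:" line per site (or a
      # plain URL line for a single page), and "#" lines are comments.
      bad_domains = ["spammy-directory.example", "paid-links.example"]
      bad_urls = ["http://blog.example/comment-spam-page.html"]

      with open("disavow.txt", "w", encoding="utf-8") as f:
          f.write("# links we asked to have removed, without success\n")
          for domain in bad_domains:
              f.write(f"domain:{domain}\n")
          for url in bad_urls:
              f.write(f"{url}\n")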

    Technical SEO Issues | | paints-n-design
    0

  • Hi Martijn, Yeah, not planning to block them in robots.txt, of course. By blocking, I meant reducing the crawl rate to ZERO temporarily to make sure we're not creating any URL-related confusion for bots. But this might not be a good solution for our customers, as a customer might be redirected to /new-url on the first hit, which might then give them an error in the next session.

    Intermediate & Advanced SEO | | _nitman
    0

  • Thank you Peter!  I contacted Sucuri's technical support and they made a change to their settings which fixed the problem.

    Social Media | | Lucas25
    0

  • Hello Lee, That's a great question. The links that Peter provided will be helpful in giving you context for this answer. In short, if you had a disavow file uploaded to SC before the migration to https://, all you have to do is make sure the https:// version of your site is verified and submitted to SC, and then re-upload the disavow file to the https:// version. In other words, you should take the disavow file that was on your http:// property in SC and upload it to the https:// property, as that is now the correct version of the site. Hope that helps!

    Online Marketing Tools | | sergeystefoglo
    0

  • The XML sitemap protocol is well defined here: http://www.sitemaps.org/protocol.html But I can quickly summarize:
    - A sitemap is limited to 50,000 URLs and up to 50MB as a file. If you need more, you can split it into several sitemaps under a sitemap index. A sitemap index can hold up to 50,000 sitemaps and up to 10MB as a file.
    - lastmod, priority and change frequency don't play a HUGE role anymore: https://www.seroundtable.com/google-lastmod-xml-sitemap-20579.html https://www.seroundtable.com/google-priority-change-frequency-xml-sitemap-20273.html - but keep them anyway so the file is fully formatted.
    - Sitemaps can be compressed (gzip).
    - A sitemap must be UTF-8 encoded, but beware of the special characters - ampersand, single quote, double quote, greater than, less than. You must replace them with XML entity escape codes (e.g. &amp;).
    - You can put the sitemap location in robots.txt, and you can list more than one sitemap there. Sitemaps can be located on 3rd-party servers too.
    I think that's the most important stuff about XML sitemaps.
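
    To make the splitting rule concrete, here's a rough Python sketch (the domain and file names are hypothetical) that chunks a URL list into sitemaps of at most 50,000 entries each and writes a sitemap index pointing at them:

      # Rough sketch of the splitting rule above: write sitemap-1.xml, sitemap-2.xml, ...
      # with at most 50,000 URLs each, plus a sitemap index that lists them.
      # Domain and file names are hypothetical; escape() applies the entity rule.
      from xml.sax.saxutils import escape

      MAX_URLS = 50_000
      NS = 'xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"'
      ENTITIES = {'"': "&quot;", "'": "&apos;"}  # escape() already handles & < >

      def write_sitemaps(urls, base="https://www.example.com/"):
          names = []
          for i in range(0, len(urls), MAX_URLS):
              name = f"sitemap-{i // MAX_URLS + 1}.xml"
              entries = "\n".join(
                  f"  <url><loc>{escape(u, ENTITIES)}</loc></url>"
                  for u in urls[i:i + MAX_URLS]
              )
              with open(name, "w", encoding="utf-8") as f:
                  f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                  f.write(f"<urlset {NS}>\n{entries}\n</urlset>\n")
              names.append(name)
          index = "\n".join(
              f"  <sitemap><loc>{escape(base + n, ENTITIES)}</loc></sitemap>"
              for n in names
          )
          with open("sitemap-index.xml", "w", encoding="utf-8") as f:
              f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
              f.write(f"<sitemapindex {NS}>\n{index}\n</sitemapindex>\n")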

    Intermediate & Advanced SEO | | Mobilio
    0

  • Who doesn't need more links? More links due to good content + good user experience / creating value = bigger business.

    Inbound Marketing Industry | | onlinefun
    0

  • I noticed MozPoint rank was pretty close to the aggregated rank for everybody in the Top 50 list. But when you look at all 4291 users, there are some big differences. I think maybe in June I'll do just a top 1000, or have a minimum point requirement.

    Behavior & Demographics | | donford
    13

  • I'm interested to know if this resolved your problem. I'm having the same issue, but with tags. Some of my WordPress blog posts have 4 or 5 tags, and Moz is flagging this as 4 or 5 cases of duplicate content.

    Technical SEO Issues | | Caro-O
    0

  • I don't think there is a one-size-fits-all recommendation to make here, which is why that post has so much detail about how to do the research necessary to determine the best route for your business. I agree that improving content is better than simply noindexing it, but I also think noindexing it is better than leaving it up long-term unimproved. And the reality is that many businesses with tens of thousands or hundreds of thousands of product pages, and most blogs with thousands of posts, aren't going to be able to economically scale rewriting all of it. The best solution for them, in my opinion, is to get rid of the pages that are dragging them down - at least get them out of the index. They can always be reintroduced once they're improved.

    Intermediate & Advanced SEO | | Everett
    2

  • Yes - backlinks and their anchors are also a factor for ranking. For example, there are a few Google bombs -> https://en.wikipedia.org/wiki/Google_bomb - where many sites link to another site with a specific anchor, and this makes the "victim" site No. 1 for those keywords. So - you can evaluate backlinks and their anchors with many tools: Moz Open Site Explorer, Majestic, Ahrefs, WebMeUp. Of all these tools, only OSE is free; all the others require a monthly subscription. That makes OSE No. 1 in this list. OSE also provides "Link Opportunity".

    Link Building | | Mobilio
    0

  • Even if you implemented all the tags correctly, there is no guarantee that Google will show them in the SERPs - the choice of whether or not to show them is entirely up to Google. I guess you already checked that all the tags are properly implemented (if not - check here or here). You could also check this page for the usage guidelines (bottom of the page). Again - even if you are doing everything correctly and you meet the guidelines, it's entirely up to Google whether the snippets will be shown or not. You can't control it. Dirk

    Intermediate & Advanced SEO | | DirkC
    0

  • You don't have to put index & follow - that's the default behaviour. It's only when you don't want a page to be indexed or links to be followed that you have to indicate it. Rel next/previous tags have to be in the header - check https://support.google.com/webmasters/answer/1663744?hl=en - not on the links themselves. Dirk
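
    For illustration, here's a tiny sketch (hypothetical URLs) of the tags that would sit in the header of, say, page 2 of a 5-page series:

      # Small sketch of the point above: rel="prev"/"next" are link tags in the
      # header of each paginated page, not rel attributes on the anchor links in
      # the body. The URLs are hypothetical.
      def pagination_head_tags(page: int, total_pages: int,
                               base="https://www.example.com/category?page="):
          tags = []
          if page > 1:
              tags.append(f'<link rel="prev" href="{base}{page - 1}" />')
          if page < total_pages:
              tags.append(f'<link rel="next" href="{base}{page + 1}" />')
          return "\n".join(tags)

      print(pagination_head_tags(page=2, total_pages=5))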

    Technical SEO Issues | | DirkC
    0