Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • I have read all over the place that ignoring these domains is the best policy, but disavowing the Globe does improve search presence, so always disavow them. I have built a utility to target the Globe and others: https://dynamic.domains/disavow-utility.zip The utility is hardly refined and still a work in progress, but it is entirely functional and will make short work of the Globe. It can scan all the domains in your linking-domains file and search for specific text. If you are like me and sorting through thousands of links, this can be a life saver, as it will open each domain and scan for whatever keywords you like. You can search multiple keywords; just separate them with commas. The search text below will weed out the Globe and a ton more: keyword research,See related links to what you are looking for,popstripeRs,Search_Engine_Optimization_Services,Keyword Suggestions,This domain may be for sale,data-adblockkey,List website,complete online resource for web host providers,URLs categories for query and submission,Estadísticas web y datos de valoración simplificados,Ashibka.ru,Check Website IP on Server,Recently Analyzed Sites,research and analysis,Tools check keyword with Search Engines,1000 domain,the_worlds_most_visited If you want to single out just the Globe, use the_worlds_most_visited and it will get them all and build a nice list for you.

    Link Building | | samdland
    2
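
The scanning approach described in the answer above can be sketched in Python. This is a minimal illustration only, not the actual utility from the zip file; the function names and simplified fetch logic are hypothetical:

```python
# Sketch of scanning linking domains for tell-tale keywords, in the
# spirit of the utility described above (not its actual code).
from urllib.request import urlopen

def parse_keywords(search_text):
    """Split comma-separated search text into lowercase keywords."""
    return [k.strip().lower() for k in search_text.split(",") if k.strip()]

def page_matches(html, keywords):
    """True if the page source contains any of the keywords."""
    html = html.lower()
    return any(k in html for k in keywords)

def scan_domains(domains, search_text):
    """Fetch each domain's homepage and return those matching any keyword."""
    keywords = parse_keywords(search_text)
    matches = []
    for domain in domains:
        try:
            html = urlopen("http://" + domain, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue  # unreachable domains are simply skipped
        if page_matches(html, keywords):
            matches.append(domain)
    return matches
```

For example, `scan_domains(my_domains, "the_worlds_most_visited")` would return only the domains whose homepages contain that string.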

  • In my experience, it will help the overall site, but still... do not expect a huge impact from these. The URLs are shared, but I don't believe people will start to link to them outside of private conversations.

    Technical SEO Issues | | Keszi
    0

  • Hey Todd, Thanks for reaching out to us! Keyword Explorer is updated about every two weeks. Also, a keyword API is on our roadmap, and we'd love to hear your feedback about it. We have a short survey available here: https://www.surveymonkey.com/r/YZ58JVH I hope this helps, but definitely let me know if there's anything else I can assist with! Eli

    API | | eli.myers
    0

  • Hi yamayamax, Did you see Miriam's response to your question? Please let us know if that helped answer your question or not. Thanks so much! Christy

    Local Listings | | Christy-Correll
    1

  • The truth is, no one knows... That's why the algorithm is such an evolving mystery. It's also different for every niche. For example, in an easy niche you can fix a page's SEO and a couple of days later it will move up the rankings 5 spots, say from position 12 to position 7. In a competitive niche, you can spend 100 hours doing on-page SEO and the page's position might improve by 1 spot over the course of 3 weeks. The best way to test different on-page factors is with single-variable tests in isolation using fake keywords, as the guys at SIA (Search Engine Intelligence Agency) do. That way you pin down the most effective tweaks and bust a lot of myths as well.

    Intermediate & Advanced SEO | | Dezzign
    0

  • In that case, you can create some rules in your robots.txt file; it all depends on the configuration of your site. You also need to check your Search Console and your crawl budget. As I mentioned, it all depends on your site. If you deal with 10 new users per day, take it easy and just configure your robots.txt file; on the other hand, if you deal with 1,000 or 10,000 users, you will need to think of a better solution. The first idea that comes to my mind is to create a JavaScript script that evaluates some parameters on those pages and, if the parameters are met, does not add the tag; if they are not, adds the tag.

    Intermediate & Advanced SEO | | Roman-Delcarmen
    0
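
Purely as an illustration of the kind of robots.txt rules the answer above mentions (the paths here are hypothetical; yours depend entirely on your site's configuration):

```
User-agent: *
Disallow: /search/
Disallow: /user-profiles/
```

Each `Disallow` line blocks crawling of one URL path prefix, which is a common way to keep low-value, user-generated pages from eating crawl budget.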

  • Hi Fabrizio, I'd advise you to link to your regular (non-AMP) page. Make sure that this page includes a tag linking to the AMP version of the page. The AMP page should then contain a rel canonical tag pointing to the non-AMP version. Because of this canonical tag, it makes more sense to focus your link-building efforts on the non-AMP version of the page. See https://www.ampproject.org/docs/fundamentals/discovery for more detailed information on how to include the tags on both pages. Hope it helps!

    Technical SEO Issues | | joramtenham
    0
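
The paired discovery tags described in the answer above look like this (the URLs are placeholders; see the ampproject.org link for the authoritative reference):

```html
<!-- On the regular (non-AMP) page: -->
<link rel="amphtml" href="https://www.example.com/page/amp/">

<!-- On the AMP page: -->
<link rel="canonical" href="https://www.example.com/page/">
```

The `rel="amphtml"` tag lets crawlers discover the AMP version, while the canonical tag consolidates ranking signals on the non-AMP page.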

  • Hi James, As far as I can see, you have the following architecture: job posting: https://www.pkeducation.co.uk/job/post-name/ jobs listing page: https://www.pkeducation.co.uk/jobs/ Since the listing-page pagination is blocked in robots.txt, only the first 15 job postings are available via a normal crawl. I would say you should remove the blocking from robots.txt and focus on implementing correct pagination. Which method you choose is your decision, but allow the crawler to access all of your job posts; check https://yoast.com/pagination-seo-best-practices/ Another thing I would change is to make the job post title the anchor text for the job posting (every single job is currently linked with "Find out more"). Also, if possible, create a separate sitemap.xml for your job posts and submit it in Search Console; this way you can keep track of any anomaly with indexation. Last but not least, focus on the quality of your content (just as Matt proposed in the first answer). Good luck!

    Intermediate & Advanced SEO | | Keszi
    0
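
A separate sitemap for job posts, as suggested above, could look something like this (the post name and date are placeholders; the URL pattern follows the site structure described in the answer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.pkeducation.co.uk/job/example-post-name/</loc>
    <lastmod>2018-01-01</lastmod>
  </url>
  <!-- one <url> entry per job posting -->
</urlset>
```

Submitting this file in Search Console gives you an indexation count specifically for job posts, separate from the rest of the site.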

  • Istvan has given some great advice here - one other thing I would add about the gibberish URLs, though: especially given the format of these, it's possible that the site may have been the victim of a hack. Make sure you get the site scanned for malware or other hacking activity, and going forward have a security expert ensure that the site is secure (especially if it's using a platform like WordPress, which is vulnerable to hackers).

    Technical SEO Issues | | bridget.randolph
    0

  • I run the training program and can share some testimonials from our recent students below. The current coursework has widely positive feedback, with many students attending more than one class. The one area of feedback we get about improvements (and are addressing) is that our most advanced attendees tend to think the courses are too basic. That makes sense, as we have been aiming mostly at the beginner-to-intermediate user and have one-off seminars focused on the most advanced user groups. The most popular courses are the Keyword Research, SEO Fundamentals, and Site Audit classes. These really focus on the practical aspects of delivering SEO; we try to make that our differentiation. Having taken a lot of online classes ourselves, we don't find value in theory-only coursework. Without a practical application, it's hard to justify the investment. So we focus on processes, application of concepts, and workflows. Here are some testimonials we've gathered recently: "It's definitely worth the price of admission" "I would highly recommend MOZ training. Covers all the questions you are afraid to ask. Gives an informative insight into SEO and helps you discover how to read and relate to what's in front of you. How it impacts to your business and most importantly how to put it into practice. Bravo! Will definitely sign up for more. Thanks " "I really enjoyed this training. The content was very well organized and essential for learning the basics of SEO. The course was very informative and educational. I highly recommend this course."

    Getting Started | | BrianChilds
    0

  • Thanks Tim, By "recently" I mean early December 2017, more or less. We were also rebuilding our booking engine, and I think I underestimated the care needed in a site change - we changed the domain name and URL structure and lost some content. I'm fixing all the issues, such as re-adding the content and making sure all technical issues are sorted out, before I worry about switching back to the old URL. My instinct was to switch back and all would be well, but it's clearly not that simple! Thanks again, Dan

    Technical SEO Issues | | DanWrightson
    0

  • Hi Matt! Short answer: as long as the redirections are correct, stick with the chosen version (non-www). Don't change back and forth; Google does not like it when you change too much, and PA and DA will be transferred. Whether to go with the www or non-www version of the site is a business/company decision; from an SEO perspective, there is no difference. Also, keep in mind that PA, DA, or any other private metric is only a measure and an educated guess (with certain clues) at how likely a page is to rank compared with other similar pages. That said, either way you decide, be sure that all redirections are set in place correctly. The checklists in these articles could be helpful: The Website Migration Guide: SEO Strategy, Process, & Checklist - Moz blog A site migration SEO checklist: Don't lose traffic - SearchEngineLand SEO Site Migration Checklist: How to migrate your website and not kill your SEO efforts - GetCredo Hope it helps. Best of luck. GR

    Intermediate & Advanced SEO | | GastonRiera
    0
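
For reference, a typical Apache .htaccess rule for 301-redirecting the www version to the non-www version looks roughly like this (a generic sketch with a placeholder domain; adapt it to your server setup):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what passes the page's authority signals to the non-www version.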

  • Hi Rachel, I think István makes a good point about the translated URLs, but just as a quick follow-up to your original question - it should not cause technical problems to have the page names the same while they are in different directories, because the full file path will be different, as long as you have hreflang properly set up. Regarding your question about canonical tags - I would not canonicalise some of these language variants to other language variants, even if you do decide to make the page names the same. Hreflang says "these two pages are different language variants of the same thing," whereas a canonical tag says "this page is just the same as this other thing" - the canonical tag doesn't have a language component, so it could conflict with your hreflang and cause errors with things like return tags. At the very least it could confuse Google as to which page should rank in which country: for instance, how can the /de/ page rank in Germany if we're telling Google it's not the canonical version of the page, but the one at the root is? Hope that helps!

    Technical SEO Issues | | R0bin_L0rd
    0
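
To make the distinction above concrete: hreflang tags declare the pages as language variants of one another, while each page's canonical points to itself (the URLs and language codes here are placeholders):

```html
<!-- On https://example.com/page/ (root/default version): -->
<link rel="canonical" href="https://example.com/page/">
<link rel="alternate" hreflang="en" href="https://example.com/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">

<!-- On https://example.com/de/page/ (German version): -->
<link rel="canonical" href="https://example.com/de/page/">
<link rel="alternate" hreflang="en" href="https://example.com/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
```

Note that each page lists all variants (including itself) so the return tags match, and neither canonicalises to the other.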

  • Hi there! I'm really sorry for the trouble here. This is actually a known issue with the MozBar at this time. For some sites an incorrect country is detected. Our engineers are aware of the issue, but until it is resolved we recommend ignoring this field. Sorry about that!

    Other Research Tools | | moz_support
    1

  • Hi there! Sorry for any confusion. If you're seeing odd URLs show up in your Top Pages report within Link Explorer or the Links section of your campaign, it is because we found a link out on the internet that leads to that page. It is entirely possible that whichever site it is coming from has an incorrect or inaccurate URL for your site. We could definitely help take a look at this if you email the campaign details to help@moz.com! Thanks.

    Link Explorer | | moz_support
    0

  • Hi Remko, Apologies for the slow response here - the alert system that lets us know a question has gone unanswered broke down for a time. The short answer: the ideal is to render the content/information as raw HTML, then use a JS library (whatever suits the animation or chart style you're after) to visualize and add animations. Animations handled with jQuery, for example, won't be processed/"viewed" by Google, so if you're using such a library to add polish to your content, it's best that the page "degrades gracefully" - so that without JS support, the key content/information is still there in the HTML source. While Google has made strides in their ability to render and index content delivered via JS resources, it is computationally expensive and we have seen that relying too heavily on JS to render the content itself is sub-optimal. If you're following the above, any modern and widely-adopted JS animation/chart-building library should be fine. Hope this helps, Mike

    Technical SEO Issues | | MikeTek
    1
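
The "degrades gracefully" pattern described in the answer above can be sketched like this (the element ID, data, and chart library are hypothetical; any widely adopted JS library fits the same shape):

```html
<!-- Key content lives in raw HTML, so it is indexable even without JS: -->
<table id="revenue-data">
  <tr><th>Year</th><th>Revenue</th></tr>
  <tr><td>2016</td><td>1.2M</td></tr>
  <tr><td>2017</td><td>1.8M</td></tr>
</table>

<script>
  // Progressive enhancement: if the (hypothetical) chart library loaded,
  // render an animated chart from the table. Without JS, the raw table
  // above remains visible to users and crawlers.
  if (window.SomeChartLibrary) {
    SomeChartLibrary.renderFromTable('#revenue-data');
  }
</script>
```

The point is that the JS layer only adds polish; removing it leaves the key information intact in the HTML source.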