Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
International Blog Structure & Hreflang Tags
Hey there–assuming you've correctly implemented your hreflang tags (as alternate links in your head code, see example below) then Googlebot should not see these as duplicates of each other. It's hard for me to weigh in on the other element of your question, "my blog structure is not standardized," without examples. How are you differentiating your international content in your URL structures? Another question: are these blog articles actually localized? Meaning, is there custom content (e.g. events in X location) or differences (e.g. spelling conventions) for these regions? If not, you may not even need these international versions of your blog content if it's not being translated.
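For reference, a minimal sketch of hreflang alternate links in the head (the URLs and locale codes here are placeholders, not your actual site structure):

```html
<!-- Hypothetical example: each regional version of a post lists itself
     and every alternate, plus an x-default fallback for unmatched users. -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/blog/post/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/blog/post/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/blog/post/" />
```

Note that each version must reference all the others (including itself) for the annotations to be valid.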
Intermediate & Advanced SEO | | zeehj0 -
Does Moz have a way to export full SERPs yet?
Hi there! Thanks so much for the great question! Within Keyword Explorer you can export the SERP analysis as a CSV. This will show you the top organic results for the keyword you're searching. Within your Campaign itself, you can see the top 50 results for a keyword you're tracking within the Analyze a Keyword section but unfortunately we don't have a way to export that at this time. I will be sure to pass along the feedback to our team here, though, so they know this is something you'd like to see in the future. If you have any other questions about our tools, please feel free to send an email on over to help@moz.com.
Other Research Tools | | meghanpahinui0 -
HTML site can occupy top positions in Google?
Yeah, agreed with the above. It makes absolutely no difference whether you build your website in PHP, HTML 4 or HTML 5; the output all comes down to the very same thing: HTML! A CMS is just a way to manage the content on your website dynamically, without FTP'ing your way around with files. With a dynamic site, the content is simply extracted from a database (often SQL-based) before being rendered as HTML.
Intermediate & Advanced SEO | | Jvanderlinde0 -
Broken canonical link errors
Great, thanks for your note Paul, I will filter through as you suggest!
Technical SEO Issues | | GhillC1 -
My Domain has a couple of badlinks decreasing my rankings, will disavowing them reduce my Domain Authority on Moz?
Using link research tools I was able to find out you have approximately 569K backlinks from only 1K referring domains, which is not a great ratio. One thing I think is extremely important for you to know: in researching your site with builtwith.com, I found that you have different canonicals for three of your four possible domain variants:

https://www.
https://
http://www.
http://

This has a lot to do with link equity: all of your redirects are pointing to different URLs with different canonicals (https://i.imgur.com/hJz4w1o.png). This is an excellent tutorial on how to force HTTPS using Nginx or Apache: https://kinsta.com/knowledgebase/redirect-http-to-https/ (if you need a lot of help with this, Kinsta is also a very good WordPress hosting company).

If your web server is running Nginx:

server {
    listen 80;
    server_name nightwatchng.com www.nightwatchng.com;
    return 301 https://nightwatchng.com$request_uri;
}

If your web server is running Apache:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

You can also just use a plugin, though I recommend avoiding plugins if you don't have to; this one is very lightweight: https://wordpress.org/plugins/really-simple-ssl/

You may also need redirect rules like these:
https://www.aleydasolis.com/htaccess-redirects-generator/
https://www.aleydasolis.com/htaccess-redirects-generator/www-to-nonwww/

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.nightwatchng.com$
RewriteRule (.*) https://nightwatchng.com/$1 [R=301,L]
</IfModule>

A similar tool for Nginx: https://www.scalescale.com/tips/nginx/nginx-redirection-www/

Redirect www to non-www for all your domains. Place this code inside your server block {}:

return 301 $scheme://$1$request_uri;

The full example in a server block {}:

server {
    server_name "~^www\.(.*)$";
    return 301 $scheme://$1$request_uri;
}

Redirect www to non-www for a single domain. Place this code inside your server block {}:

return 301 $scheme://mysite.com$request_uri;

Example:

server {
    server_name www.mysite.com;
    return 301 $scheme://mysite.com$request_uri;
}

https://support.cloudflare.com/hc/en-us/articles/218411427-Page-Rules-Tutorial
https://support.cloudflare.com/hc/en-us/articles/200170536-How-do-I-redirect-all-visitors-to-HTTPS-SSL-
https://kinsta.com/knowledgebase/redirect-http-to-https/

Hope this helps, Tom

PS (I really wish there was a way for this tool to save a draft, because I just wrote a very long reply and it's gone. Either way, here is a shorter version of what I had written before.)
Link Building | | BlueprintMarketing0 -
Company with multiple services | multiple locations/states
Hi Ryan, Complex scenario, but the good news is you don't have to reinvent the wheel on this. Look at a website like https://www.rotorooter.com/ to see how they manage the fact that they've got 600 locations in North America. If your company is expanding nationwide, you need some type of interface (a zip code search, map, etc.) to get clients from the homepage to their correct section of the website. I see no reason to use subdomains; they typically just complicate things. You can create a landing page for each location (or a section of several landing pages if you absolutely must), but the goal is to take the client directly from the homepage to the page that tells them everything they need to know about the location nearest them. If you go this route, I would advise:

Ensuring you have a sitemap that links to all of the landing pages, just to ensure full crawling.
Avoiding duplicate content on these pages as much as possible; make them unique and useful. This article should help: https://moz.com/blog/overcoming-your-fear-of-local-landing-pages
Being sure you're building out a full set of local business listings/citations for each location, and that the company has a strategy in place for managing reviews on them.

That should get you off to a good start!
Local Strategy | | MiriamEllis1 -
I ask for a refund.
Hi there, Sam from Moz's Help Team here! Could you please send an email over to help@moz.com, using the email address you used to register? Looking forward to helping further!
Technical Support | | samantha.chapman0 -
Html extensions
I wanted to reinforce what Martijn and Gaston had said. I would just have your web person 301 redirect the old URLs to the new URLs. I would also keep an eye out for any stray URLs that you may have missed in Search Console and redirect those too; they tend to pop up 1-2 weeks after a site move. Thanks!
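If the change is specifically dropping the .html extension, here's a hedged sketch of the kind of rule your web person might use (assumes Apache with mod_rewrite; the pattern is illustrative and should be verified against your actual URL structure before deploying):

```apache
# Hypothetical sketch: 301-redirect /page.html to /page site-wide.
RewriteEngine On
# Only act on direct client requests for .html URLs
RewriteCond %{THE_REQUEST} \s/([^.?\s]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]
```

Matching against %{THE_REQUEST} (the raw request line) rather than the rewritten URL avoids redirect loops if the server later maps extensionless URLs back to .html files internally.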
Technical SEO Issues | | JohnSammon0 -
How Can i check my new domain metrics
Hi there, You can use Moz's Link Explorer. As you are able to post this question, you surely have a Pro account (or at least the free trial), so you'd be able to see everything Moz's tool has to offer. Here I've checked your domain: https://imgur.com/a/jLJHIW0 Hope it helps you. Best luck. GR PS: I hope you're not making this post for link building purposes; you won't get any link juice, as the links here are nofollow.
Moz Tools | | GastonRiera0 -
What is the name of BrightEdge's crawler?
Hi there, If you can't find it online, you can always check your log files. Ask your hosting provider or talk to your infrastructure team so they can help you. What you need is 4 columns: date, user-agent, IP and requested URL. Why is the IP needed? Some crawlers try to hide under the GoogleBot user-agent. Just take a few IPs and, with a simple DNS lookup, you'll know which aren't Google's. PRO tip: Windows, CMD console: nslookup [IP address] Linux, terminal: nslookup [IP address] Hope it helps. Best luck. GR
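To illustrate the log-file approach, here's a minimal sketch that pulls those four columns out of combined-log-format lines and collects the hits claiming to be Googlebot (the sample lines and IPs are made up; real verification still needs the reverse DNS lookup described above):

```python
import re

# Parse Apache/Nginx combined-log lines into date, user-agent, IP and
# requested URL, then collect the IPs whose user-agent claims Googlebot
# so they can be verified with nslookup.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?:GET|POST|HEAD) (?P<url>\S+)[^"]*" \d+ \d+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_claims(log_lines):
    """Return (ip, date, url) tuples for hits whose user-agent claims Googlebot."""
    hits = []
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits.append((m.group("ip"), m.group("date"), m.group("url")))
    return hits

# Hypothetical sample log lines
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Oct/2023:13:56:01 +0000] "GET /page HTTP/1.1" '
    '200 1024 "-" "SomeOtherBot/1.0"',
]
print(googlebot_claims(sample))
```

Each IP this returns can then be fed to nslookup; anything not resolving to a googlebot.com or google.com hostname is an impostor.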
Online Marketing Tools | | GastonRiera0 -
Page not being ranked properly
Let's look at your case. First of all, there are many factors involved in ranking a website; some are internal and others external. Based on your comments, you want to rank:

Target location: UK
Target keyword: dog breeds
Target page: https://www.mypetzilla.co.uk/dog-breeds
On-page optimization score: 89 (you can check this in Moz Pro > On-Page Grader)

With this information, the first thing you need to check is who is ranking for that keyword in that location. You can do that with several tools, such as Moz, Ahrefs, SEMrush or Ubersuggest.

Moz: Keyword Difficulty 46 (not impossible, but certainly not easy), Volume 30K-70K
Ubersuggest: Keyword Difficulty 46, Volume 90K
Ahrefs: Keyword Difficulty 56, Volume 60K

Based on what I see, you're competing in a strong and competitive niche, so the first conclusion that jumps to mind is that you need to change your target keyword for that page. But let's keep digging into your problem.

Now let's analyze your target page, https://www.mypetzilla.co.uk/dog-breeds.

Moz Link Explorer report:
Page Authority: 32
Domain Authority: 31
Linking Domains: 42
Inbound Links: 1.5K
Ranking Keywords: 1

But all this information is useless if you don't have something to compare it against. In the Keyword Explorer tool you can see the top 10 websites ranking for that keyword and location (Keyword Explorer > enter "dog breed" > SERP Analysis). These are the first three results, but I strongly suggest you check all the reports yourself in your own Moz Pro account.

1 - https://dogtime.com/dog-breeds/profiles
Page Authority: 50 | Domain Authority: 70 | Linking RDs to page: 199 | Linking RDs to root domain: 18,393

2 - https://www.petwave.com/Dogs/Breeds.aspx
Page Authority: 44 | Domain Authority: 54 | Linking RDs to page: 129 | Linking RDs to root domain: 4,642

3 - https://www.purina.com/dogs/dog-breeds
Page Authority: 44 | Domain Authority: 64 | Linking RDs to page: 51 | Linking RDs to root domain: 12,564

As you can see, there is no big mystery here; in fact, it's very simple: you are not competitive enough. These are my suggestions:

Face the truth: you can't compete with what you have right now, mainly because your site is too weak.
Change your target keywords. Look for long-tail keywords with a lower competition level, such as "dog breed finder".
Run a link building campaign, and I'm not talking about the junk links you can buy on Fiverr or the junk PBN links you can rent or buy for pennies. Cheap links = cheap results.
Optimize your technical SEO; you really need to take care of that. To give you an idea, in a quick audit I found 4,230 opportunities on your internal pages to build internal links using the "dog breed" anchor pointing to your target page.

I hope this information answers your question. If you consider my answer helpful, don't forget to mark it as a Good Answer. Regards and have a great day
Intermediate & Advanced SEO | | Roman-Delcarmen0 -
Search visibility is not shown in "Manage your campaigns"
Hi Eli, I wrote to help@moz.com so you could have a closer look. BR, Predrag
Other Questions | | Predrag.Zivkovic0 -
Can I view Domain Authority stats for longer than the previous 12 months?
Thank you, I responded via email. Since you have improved the data so much, does this mean that going forward you can look at legacy data, or will it continue to be limited to 12 months? I am having to track my own data so that I do not lose it. Chris
Getting Started | | cptutty0 -
Do long UTM codes hurt SEO?
The correct way to use UTM tracking without hurting your SEO efforts is to make sure you've implemented your canonical tags correctly. You should add self-referring canonical tags, which will prevent multiple versions of the same page from being indexed. For example, http://www.yoursite.com/some-page?utm_source=facebook&utm_medium=social should have a canonical tag that looks like this: <link rel="canonical" href="http://www.yoursite.com/some-page" /> If you have pages with these parameters on your site, then you should use the rel="canonical" tag to specify the canonical URL that you'd like Google to rank. I hope this information answers your question. If you consider my answer good enough, don't forget to mark it as a Good Answer. Regards and have a great day
Technical SEO Issues | | Roman-Delcarmen0 -
Webshop landing pages and product pages
Hi Roman, Thanks for the answer. I'm coming back to this question, since just adapting the taxonomy and possibly adding some tags doesn't resolve the problem. It improves the UX, and creating a new structure was part of the planning anyway. I just think we will be missing out on a lot of traffic, because there are a lot of high-volume keywords with low difficulty that are only applicable to the product itself (and not to a category, subcategory or tag). A copywriter will be assigned to write descriptions for every product anyway, precisely because these products are so specific and need more explanation. If we take the keywords for a specific product and integrate them into the product description, I think we can surely rank with these product pages. Or do I see this wrong? Thanks!
Intermediate & Advanced SEO | | Mat_C0 -
Standardising of Company Name Across The Web Question
It would be very unlikely that using "and" instead of "&" throughout your site would confer any advantage, and it might confuse your customers if that's not the way they are used to seeing your name. So stick with the & you use in your branding. As always, when these types of questions come up and you are not sure of the answer, go with whatever is the best user experience and you are likely to be rewarded. (Also, take a look at some big sites like Lord & Taylor and see what they do: they don't give up their branding just because there is an "and" in their URL.)
Branding / Brand Awareness | | Linda-Vassily0