Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Is it worth getting a backlink to a page that will be 301'd?
Thanks Gaston! I am hoping the website can go live before the links are created, but I'm doubtful. Either way, I think they're worth it, as the referring domains would have very high DA (Huffington Post, etc.), and I'm sure the pages will also have high PA.
Link Building | | TimThiel0 -
Clicks are the ultimate factor to stick the page on position?
Hi, Yes, you're right: CTR (click-through rate) is an important factor in SEO too. If users click on your site's link in the SERPs, Google will think that your page is relevant and you will get a better rank. Thanks
Search Engine Trends | | Alick3000 -
I want to load my ecommerce site xml via CDN
Hello Micey123, That sounds good except you should put the sitemap reference for xyz.abcd.com within that subdomain's robots.txt file as well: xyz.abcd.com/robots.txt, as each subdomain should have its own robots.txt file.
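As a sketch of that advice, the subdomain's robots.txt might look like this (the hostname is taken from the question; the sitemap filename is illustrative):

```
# Served at https://xyz.abcd.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://xyz.abcd.com/sitemap.xml
```

Each subdomain serves its own robots.txt from its own root, so the Sitemap line must use the subdomain's full URL.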
Technical SEO Issues | | Everett0 -
Handling Pages with query codes
Hi Richard

These are parameters that sit after the main URL and often include 'sort' and 'page'. (They can also be created on some eCommerce pages as 'products', but those should be dealt with via a mod-rewrite to show properly constructed URLs with the category name and title.) There are a number of ways of dealing with them:

1. Google Search Console - you have to be very careful messing with the rules in parameter handling, but for some parameters this is the way. For 'sort' you can tell Google that it narrows the content on the page, then either let Googlebot decide or block the URLs - I often block them, as they just create thin and duplicate content. Pagination ('page'): you can tell Google that the parameter paginates and then let Google decide; look at rel=prev/next tags on those pages as well. Attributes like size and colour: I generally block those, as they just create thin duplicates of the main categories. Others, like 'catalog': it depends on what platform you use, but there could be other parameters being created - I block most of them, as they create useless URLs.

2. Robots.txt - you can use this file to block these pages by parameter, excluding them from being crawled by the search bots. Once again, be very careful: you don't want to accidentally block indexing of useful areas of the site. Your 'bskt' page should be dealt with like this, but read about all the other options as well. https://moz.com/learn/seo/robotstxt

3. Canonicals - if you are able to, a great way of dealing with attributes like size and colour is to canonicalize back to the non-size-specific URL. This is a great way of maintaining the link juice for those URLs, which might otherwise be lost if you blocked them altogether. You add a rel=canonical tag pointing to the non-parameter version. https://moz.com/learn/seo/canonicalization

4. As a last resort you can 301 redirect them, but frankly, if you have dealt with them properly you shouldn't have to.
It's also bad practice to have live 301 redirects in the internal structure of a website - best to use the correct URL. There is more reading here:
https://moz.com/community/q/which-is-the-best-way-to-handle-query-parameters
https://moz.com/community/q/do-parameters-in-a-url-make-a-difference-from-an-seo-point-of-view
https://moz.com/community/q/how-do-i-deindex-url-parameters
Regards Nigel
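As a sketch of the canonical option above, a colour-filtered URL can point back to its parent category like this (domain and paths are hypothetical):

```html
<!-- In the <head> of https://example.com/shirts?colour=blue -->
<link rel="canonical" href="https://example.com/shirts" />
```

The variant page stays crawlable, but the signal consolidates on the non-parameter URL rather than being split across every size/colour combination.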
Technical SEO Issues | | Nigel_Carr1 -
Does Google push down for not ranking top for branded keywords?
Hi vtmoz

You should definitely have pages that target brand-related queries, and this is not spam. I don't know what you sell, but you should have 'landing pages' - I prefer to call them 'categories' - for all the things, services, or product categories your company covers. Make sure the pages are tight and succinct, have 300-1000 words of content, and are highly contextually targeted to the different categories you cover. This is not spam; it is normal site structure and you will not be penalised. Regards Nigel
Search Engine Trends | | Nigel_Carr0 -
Menu interlinking Pages
Hi

A site-wide menu at the top is essential. If you don't have links to, say, sub-categories in that menu, then have a side menu with the lesser (lower) pages in there. Make sure they are linked through the anchor text you deem to be the most searchable term. Yes, Google 'counts' the menu links - it follows them and uses them to help index the site for search terms. The menu is either a structural set of links at the top of the website on every page, or a list of links on the left- or right-hand side of the page on certain pages. I fear there may be a slight misunderstanding of terminology and semantics here, so I have tried to be very clear. Regards Nigel
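As an illustrative sketch (all URLs and anchor text hypothetical), a crawlable site-wide menu with sub-category links is just plain HTML anchors:

```html
<nav>
  <ul>
    <li><a href="/mens-shoes/">Men's Shoes</a>
      <ul>
        <li><a href="/mens-shoes/running-shoes/">Running Shoes</a></li>
        <li><a href="/mens-shoes/hiking-boots/">Hiking Boots</a></li>
      </ul>
    </li>
  </ul>
</nav>
```

Because these are ordinary href links rendered on every page, Googlebot follows them, and the anchor text ("Running Shoes") helps it understand what each target page should rank for.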
Intermediate & Advanced SEO | | Nigel_Carr0 -
What is the best way to treat URLs ending in /?s=
Hi Alex

These are parameters that sit after the main URL and often include 'sort' and 'page'. (They can also be created on some eCommerce pages as 'products', but those should be dealt with via a mod-rewrite to show properly constructed URLs with the category name and title.) There are a number of ways of dealing with them:

1. Google Search Console - you have to be very careful messing with the rules in parameter handling, but for some parameters this is the way. For 'sort' you can tell Google that it narrows the content on the page, then either let Googlebot decide or block the URLs - I often block them, as they just create thin and duplicate content. Pagination ('page'): you can tell Google that the parameter paginates and then let Google decide; look at rel=prev/next tags on those pages as well. Attributes like size and colour: I generally block those, as they just create thin duplicates of the main categories. Others, like 'catalog': it depends on what platform you use, but there could be other parameters being created - I block most of them, as they create useless URLs.

2. Robots.txt - you can use this file to block these pages by parameter, excluding them from being crawled by the search bots. Once again, be very careful: you don't want to accidentally block indexing of useful areas of the site. https://moz.com/learn/seo/robotstxt

3. Canonicals - if you are able to, a great way of dealing with attributes like size and colour is to canonicalize back to the non-size-specific URL. This is a great way of maintaining the link juice for those URLs, which might otherwise be lost if you blocked them altogether. You add a rel=canonical tag pointing to the non-parameter version. https://moz.com/learn/seo/canonicalization

4. As a last resort you can 301 redirect them, but frankly, if you have dealt with them properly you shouldn't have to. It's also bad practice to have live 301 redirects in the internal structure of a website - best to use the correct URL.
There is more reading here:
https://moz.com/community/q/which-is-the-best-way-to-handle-query-parameters
https://moz.com/community/q/do-parameters-in-a-url-make-a-difference-from-an-seo-point-of-view
https://moz.com/community/q/how-do-i-deindex-url-parameters
Regards Nigel
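For the /?s= case specifically (a WordPress-style search parameter), the robots.txt approach from option 2 might look like this - a sketch only; test wildcard rules in Search Console's robots.txt tester before deploying, since they can easily block more than intended:

```
User-agent: *
Disallow: /*?s=
Disallow: /*&s=
```

The second rule catches URLs where s= is not the first query parameter.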
Moz Tools | | Nigel_Carr0 -
My backlinks are not showing in webmaster tools? Why
Hey, I am also facing this issue for my news website. I have been working on it for the last 2 months, but still none of my backlinks are indexed properly. You can check my website, blockcrux, in Moz - it shows zero linking domains.
Technical SEO Issues | | Eyekia0 -
I see some links on Youtube.com that look like they are "Do-follow" links am I wrong?
Hi Bill, I think this will help answer your question. The short answer is no, you cannot get a dofollow backlink from YouTube. I believe the MozBar is failing to properly register these comment links as nofollow because they are JavaScript-based and the MozBar is not compatible with JavaScript. The links that I see in the video descriptions are registering as nofollow with the toolbar. Even the links on individual channel homepages are being stripped of SEO value due to the rel="me nofollow" attribute, which acts the same as a standard nofollow tag [see screenshot]. I will note that all internal links (links to other youtube.com pages) register as dofollow, but outbound links are nofollowed. Hope this helps!
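This is not how MozBar works internally, but as a quick sketch of checking rel attributes yourself, Python's standard-library HTML parser can list which anchors in a page's static HTML carry nofollow. Note it only sees the raw markup - links injected by JavaScript won't appear, which is exactly the limitation described above. The sample anchors are illustrative:

```python
from html.parser import HTMLParser

class LinkRelChecker(HTMLParser):
    """Collect (href, rel) for each anchor so nofollow links can be spotted."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record every <a> tag's href and rel (empty string if rel is absent).
        if tag == "a":
            attr = dict(attrs)
            self.links.append((attr.get("href"), attr.get("rel") or ""))

# Two illustrative anchors: an internal-style link, and an outbound one
# carrying the rel="me nofollow" pattern described above.
sample = (
    '<a href="https://www.youtube.com/watch?v=abc">internal</a>'
    '<a href="https://example.com" rel="me nofollow">outbound</a>'
)

checker = LinkRelChecker()
checker.feed(sample)
for href, rel in checker.links:
    print(href, "-> nofollow" if "nofollow" in rel.split() else "-> followed")
```

Run against real page HTML (fetched however you like), this flags any link whose rel attribute contains the nofollow token, including combined values like "me nofollow".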
Moz Tools | | Joe_Stoffel0 -
Privacy policy page at the bottom of web
Like this in robots.txt: Disallow: /my-private-terms.html
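One caveat worth adding (a general note, not from the thread): a robots.txt Disallow only stops crawling, so the page can still appear in the index if other pages link to it. A meta robots tag on the page itself is the more reliable way to keep it out:

```html
<!-- In the <head> of the privacy/terms page -->
<meta name="robots" content="noindex, follow">
```

For the tag to be seen, the page must remain crawlable, so don't combine it with a robots.txt block on the same URL.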
Intermediate & Advanced SEO | | Agenciaseomadrid0 -
One of our clients has a ranking anomaly happening...
Joe Stoffel's answer is perfect. You can also check, at the bottom of Google's web results, which location (position or city) Google thinks you are searching from; it can change because Google serves your results from different servers. Google may also change your results in an incognito window, or stop showing the city altogether.
Intermediate & Advanced SEO | | Agenciaseomadrid1 -
Our client changed their domain name. Can we replace it without losing the data for the old domain in Moz? We are at our campaign limit - would the old domain count as a campaign?
Hey there! Tawny from Moz's Help Team here. Unfortunately, there's no way to change the URL a campaign is tracking once the campaign is started. If you need to track a new domain, you'll need to archive or delete a campaign to make room for the new one. I hope that helps! If you still have questions, feel free to write in to us at help@moz.com and we'll do our best to help with any and all issues you might run into.
Technical Support | | tawnycase1 -
Keyword explorer
You're very welcome! Do feel free to reach out if you need any more help.
Intermediate & Advanced SEO | | andy.bigbangthemes0 -
Outreach - Guest Blogs & Articles
We publish all of this content on our own site. We don't syndicate, share, or guest post anything. Instead we promote the library of information on every page of our website, link to it in the persistent navigation, guide customers to it when they write to us for information, feature it in a newsletter that we send every month, and let industry groups know that it is available for viewing on our website. That's what we do. Our customers and site visitors then share it for us.
Social Media | | EGOL0 -
Google's stand on LSI keywords?
Hi vtmoz,

It's very common for Google to change its algorithm in response to user trends. The thing to remember is that users have different forms of intent when they use certain words or phrases, and Google is constantly altering its ranking methods to reflect this. While LSI keywords tend to remain static, there can be alterations and shifts within the industry or in how people observe and interact with it. However, these changes would not necessarily occur because companies begin using them more frequently. More likely, Google is dipping into its reservoir of big data to determine that user queries leading to certain pages were not producing user satisfaction (i.e. bounce metrics were high on pages that were previously identified as supplying relevance via LSI keywords) and therefore made a change to better reflect what users were searching for (and interacting with) from their SERPs. A few questions to ask:

- Has anything changed within your industry that would cause an LSI keyword shift (new products, new competitors, new rules and regulations, etc.)?
- Is there a pattern in terms of the keywords that have changed? Is it industry-wide or a specific segment?
- Are there new ways users may be interacting with the industry? New queries being used?
- What is the impact on your rankings for general terms related to those LSI terms/phrases?

Based on the answers to these questions, you can better identify whether it's a shift from Google altering the LSI algorithm for your industry, or simply an indicator of a developing industry. My guess is the latter. Hope this helps - feel free to reach out any time if you need a clarification or just want to chat! Thanks, Rob
Search Engine Trends | | RobCairns0