Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Are there any SEO issues we should be aware of on Gutenberg?
Hi, Gutenberg is simply a back-office editor. It has no impact on SEO. How you organise your web pages is up to you, whether you use the classic editor (for which there is still a plugin) or the new editor. Neither affects SEO as long as the on-page content and technical aspects are correct: relevant title tags, on-page content, images, all the usual stuff! Regards, Nigel
Technical SEO Issues | | Nigel_Carr0 -
Business Split into 2 Businesses - Residential and Commercial Site - 2 different URLs and both have the same address! Can we create 2 separate Google My Business Accounts?
Justin, as a rule of thumb, you need to refer to Google's guidelines on representing your business. If your business entities fit the following criteria, then you are safe; if not, I would not risk it. 1. Each has a separate LLC (or other business entity). 2. There are separate offices (doors) for each business. 3. The business names and phone numbers are different (a different suite number is good to have but not required). 4. There is distinct signage for each business entity. Based on how I have heard this question phrased in the past, I think you may just be asking whether having a separate domain is enough, and the answer would be no.
Local Listings | | Ben_Fisher0 -
Do product sub-categories compete with top level categories for rankings?
Hi Jason, don't tear your subcategories apart into subdomains. That is likely to perform poorly and cannibalize your search terms. I work at an XXXL e-commerce company where there are over 30 subdomains per country. We found that this configuration wasn't good enough to compete with the other big players in Google Search, so we ran some tests and found that moving from a subdomain to a subfolder of a bigger domain pays off in traffic. That said, I'd focus more on checking what your main competitors are doing; there might be a lot of opportunity in optimizing the content you offer users. A tip: check how they and you render in mobile views - there might be a low-hanging fruit there, unless you are already doing that. Hope that helps. Best of luck. GR
On-Page / Site Optimization | | GastonRiera0 -
Site migration / CMS / domain / site structure change - no access to Search Console
If your architecture is changing (e.g. from non-www to www, then from HTTP to HTTPS), just be careful that your developer's logic doesn't start 'stacking' redirect rules. You want to avoid this: A) user requests http://oldsite.com/category/information B) 301 redirect to http://newsite.com/category/information C) 301 redirect to https://newsite.com/category/information D) 301 redirect to https://www.newsite.com/category/information. Keep your redirects **strictly origin to final destination** and you'll probably be OK! In the case of my example, the redirect should go straight from A to D, not from A to B (hope that makes sense). Install this Chrome extension so that you can see redirect paths in your Chrome extension buttons menu. It's very, very handy for testing redirects.
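One way to sanity-check a redirect map for stacking before it ships is to collapse every rule to its final destination. A minimal sketch, assuming the rules live in a simple dict (the URLs below are the hypothetical ones from the example above; in a real migration they would come from your server config or CMS):

```python
# Hypothetical redirect map: each old URL points at its *immediate* target,
# which is exactly the "stacked" situation described above.
REDIRECTS = {
    "http://oldsite.com/category/information": "http://newsite.com/category/information",
    "http://newsite.com/category/information": "https://newsite.com/category/information",
    "https://newsite.com/category/information": "https://www.newsite.com/category/information",
}

def collapse(redirects):
    """Rewrite every rule so it points straight at its final destination."""
    def final(url):
        seen = set()
        while url in redirects:
            if url in seen:
                raise ValueError(f"Redirect loop at {url}")
            seen.add(url)
            url = redirects[url]
        return url
    return {src: final(dst) for src, dst in redirects.items()}

flat = collapse(REDIRECTS)
# Every origin now 301s straight to the final https+www URL in one hop:
print(flat["http://oldsite.com/category/information"])
# → https://www.newsite.com/category/information
```

Deploy the collapsed map instead of the chained one and users (and Googlebot) always take a single hop, A straight to D.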
Intermediate & Advanced SEO | | effectdigital0 -
Results After a Disavow File Submission
I gave a solid response here which should shed some light on your disavow predicament. Disavow work is a preventative measure; it is not work you should 'expect' to raise rankings. If you didn't replace the disavowed (discredited) backlinks with decent ones, rankings will obviously just keep declining.
Link Building | | effectdigital0 -
301 Question - issue
It's probably just taking Google a while to process all the changes. Really, your 301s should point to the same content, not all go to the homepage. If you had pages showing on two sites, the pages 'really' exist on one site but weren't supposed to exist on the other. Correct the 301s so that they point from the URLs on the affected site to the exact same pieces of content on the site where they were originally supposed to be located. If that fails, use the HTTP header and X-Robots-Tag (not noindex meta tags - fire the noindex directive from the HTTP header instead of the HTML) to tell Google not to index those URLs on the 'affected' website. In conjunction with that, alter the status code of all bogus URLs on the 'affected' site to 410, which is a stronger signal than 404 (410 means GONE - not coming back - while 404 merely means "not found" and implies the page may return).
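As a sketch of that 410-plus-header combination (the paths and the helper are hypothetical - in practice you would configure this in your web server or framework rather than hand-rolling it):

```python
# URLs that existed on the 'affected' site by mistake and should never return.
# These paths are invented for illustration.
GONE_PATHS = {"/duplicated-category/page-1", "/duplicated-category/page-2"}

def response_for(path):
    """Return (status_code, headers) per the approach above: 410 Gone plus
    an X-Robots-Tag noindex directive fired from the HTTP layer, not the HTML."""
    if path in GONE_PATHS:
        return 410, {"X-Robots-Tag": "noindex"}
    return 200, {}

status, headers = response_for("/duplicated-category/page-1")
print(status, headers.get("X-Robots-Tag"))  # → 410 noindex
```

The point of the HTTP-header variant is that it works even for non-HTML resources and is honoured on the very first fetch, before any page body is parsed.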
Intermediate & Advanced SEO | | effectdigital0 -
Our original content is being outranked on search engines by smaller sites republishing our content.
Hi Andrew, this sounds like it could be a question of E-A-T (expertise, authoritativeness and trustworthiness), something Google places a very high value on. In a nutshell, Google wants to rank domains that are proven experts in their field, and it places much higher value on E-A-T than on other classic SEO tactics like keyword placement. Here's a hypothetical: if you, a large media site that covers a variety of topics, published an article about how a vegetarian diet benefits your health that was written by a journalist and not a doctor, it would have a relatively low level of E-A-T. However, if my smaller niche site that focuses only on health-related issues and publishes content from doctors, nurses, trainers etc. republished that article, it would have a higher level of E-A-T. There are several ways to improve your E-A-T: if you have qualified people writing content about their areas of expertise, you should let the world know. Tweet about them, include their names in articles and give them an "about us" or "about the author" write-up on your site. You can read your fill about E-A-T in the Search Quality Evaluator Guidelines if you need more details: https://static.googleusercontent.com/media/www.google.co.uk/en/uk/insidesearch/howsearchworks/assets/searchqualityevaluatorguidelines.pdf
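One machine-readable companion to those byline and "about the author" write-ups (not mentioned in the answer above, but a common tactic) is schema.org Article markup with an `author` Person. A rough sketch - the names and URLs here are invented:

```python
import json

# Hypothetical article byline expressed as schema.org Article structured data.
# The headline echoes the vegetarian-diet example above; author is invented.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How a Vegetarian Diet Benefits Your Health",
    "author": {
        "@type": "Person",
        "name": "Dr. Jane Example",                 # invented name
        "jobTitle": "Registered Dietitian",          # credential worth surfacing
        "url": "https://example.com/about/jane",     # the "about the author" page
    },
}

# This JSON-LD would go in a <script type="application/ld+json"> tag on the page.
print(json.dumps(article, indent=2))
```

Pairing the visible write-up with markup like this makes the author's credentials explicit rather than leaving them for crawlers to infer.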
Technical SEO Issues | | Jason-Reid0 -
Ranking at 2 and 3 for a Desired Keyword - But Missing #1
Thanks for the help and ideas. I appreciate you taking the time to help me out
On-Page / Site Optimization | | photoseo10 -
JPGs listed as the Top Pages in Moz
We have this same issue. Surely the top pages should be (in our case) product pages or category pages, not all images?
Link Building | | funideas1 -
Anybody have experience with outreach tools? e.g. ninja outreach / Pitchbox
Our agency uses Buzzstream and I highly, highly recommend it. Definitely worth checking out.
Online Marketing Tools | | TaylorRHawkins0 -
Is it more beneficial to use Yext rather than doing the citations manually?
I think Yext is really worth the cost and has the most important local directories covered. One main advantage of services like Yext or Moz Local is the ease of making updates to the listings. With manual citations, if you need to make a small change to the hours or description of the business, you have to repeat the task on every site, one by one. With Yext or Moz Local, it happens quickly: you just update the business under your account, and automation takes care of updating the rest of the listings in the background for you. There is also a business listing scan tool out there that I thought might be relevant to this question; it covers the most sites and tells you your completeness and accuracy scores, with a short link you can use to share the web-based report with others. I hope you find my answer useful.
Local Listings | | SaeedKhosravi1 -
Structured data for different-sized packages
Hey! No problem - just trying to figure out the best way to do this too! Thanks for the detailed reply. All valid points - regarding indexing thin content, and showing customers more than one size - but those can be solved. Let's look at this with an actual example. Redbubble.com (an Alexa top-1000 website in the US) sells a throw pillow in different sizes and types, and the cost differs based on the size and type chosen. This is their main product page for this product: _https://www.redbubble.com/people/straungewunder/works/25221192-familiar-sooty-owl?p=throw-pillow_ On this main product page they send the customer to a default size (16x16) and type (cover only), but as it is a dropdown the customer is not stuck with just one size - he or she can choose others from the dropdown. On this same page, they have this schema markup. ..... Then they have duplicate pages for all the other pricing options. E.g. for size 26x26 and type "cover only", the URL is _https://www.redbubble.com/people/straungewunder/works/25221192-familiar-sooty-owl?p=throw-pillow&size=26x26&type=cover-only_ and the schema markup is identical to the one listed above, _except for the price._ All these pages are exactly alike except for the default size and type chosen, and therefore the price differs on each page. Duplicate pages are not a problem because they use canonical tags properly: all the pages carry a canonical tag pointing to the original page. Regarding indexing: **only the original page is indexed.** If you search Google for their main product URL, it comes up; if you search for the other product pages with different pricing options, they are not indexed.
So **Google isn't wasting crawl budget on these duplicate pages.** In your case, though, you would index more pages if the search volume is high for different quantities (and then also change the H1/title/meta tags respectively for those indexed pages). I've also written this up as a blog post, as I think more people have this problem and will find it useful. Apologies if you have already considered this, but let me know if this still doesn't work for you - interested to know what you finally go with!
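To make the variant-page pattern concrete, here is a rough sketch of markup that differs only in price while sharing everything else, including the canonical URL. The product name, prices and URLs below are illustrative, not Redbubble's actual markup:

```python
import json

def variant_markup(name, price, canonical_url):
    """schema.org Product markup for one size/type variant. Everything except
    the price is shared across variants; each variant page would also carry a
    <link rel="canonical"> pointing at canonical_url, as described above."""
    return {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": canonical_url,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": "USD",
        },
    }

# Hypothetical values: the 16x16 default and a 26x26 variant of the same pillow.
main = variant_markup("Familiar Sooty Owl Throw Pillow", 18.90,
                      "https://example.com/p/owl-pillow")
variant = variant_markup("Familiar Sooty Owl Throw Pillow", 29.50,
                         "https://example.com/p/owl-pillow")
# Identical markup except for the price - the situation described above.
print(json.dumps(variant["offers"], indent=2))
```

Because every variant canonicalises to the same URL, only one markup block ends up in the index regardless of how many priced variants exist.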
Intermediate & Advanced SEO | | PaperTrail1 -
Truncated product names
If you served two different source codes via user-agent (web user vs. Googlebot), then you'd be more at risk of this. I can't categorically state that there is no risk in what you are doing, as Google operates multiple mathematical algorithms to determine when 'cloaked' content is being used - and guess what? Sometimes they go wrong. That being said, I don't believe your risk of garnering a penalty is particularly high with this type of thing. These are the guidelines: https://support.google.com/webmasters/answer/66355?hl=en You're in a really gray area because you aren't serving different URLs, but you _could_ be serving different content (albeit only slightly). I say 'could' rather than 'are' because it entirely depends on whether Google (on any particular crawl) decides to enable rendered crawling or not. If Google uses rendered crawling and takes the content from its headless-browser page render (which it can do, but doesn't always choose to, as it's a more intensive crawling technique), then your content is actually the same for users and search engines. If, however, it just does a base-source scrape (which it also does frequently) and takes the content from the source code (which doesn't contain the visual cut-off), then you are serving different content to users and search engines. Because you're down in a granular area where the rules may or may not apply conditionally, I wouldn't think the risk is very high. If you ever get any problems, your main roadblock will be explaining the detail of the problem on Google's Webmaster Forums - support there can be very hit and miss.
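For what it's worth, one way to sidestep the ambiguity entirely - assuming the truncation is purely cosmetic - is to keep the full product name in the HTML and let CSS do the cutting, so the base source and the rendered DOM carry exactly the same text. A hypothetical sketch (class name and width are invented):

```css
/* Hypothetical: the full product name stays in the markup; only the
   on-screen display is shortened, so users and crawlers see the same source. */
.product-name {
  white-space: nowrap;
  overflow: hidden;
  text-overflow: ellipsis; /* renders e.g. "Very Long Product Na…" */
  max-width: 220px;        /* illustrative width */
}
```

With this approach there is no difference between the base-source scrape and the rendered crawl, so the cloaking question never arises.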
Technical SEO Issues | | effectdigital0 -
Apart from spying on competitors' backlinks, what else can be done in Moz?
In a way, I understand Manifeat9's concern with this question, and Ely Myers answered with the usual general rundown of Moz's features. It's true that you can get high-value results, but the same goes for other tools such as Semrush. I think the question is more focused on the additional added value, such as a list of URLs for link earning. If I am paying $179 for the subscription, I would like to have that added value, with effective and specific examples for my business. I don't know if the question is still specific enough, and if I'm wrong, please accept my apology.
Link Explorer | | ulhosting0