Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Central Index anchor text
To make matters worse, I can't remove any listings: I can't log in to Central Index because the whole process is being managed by Moz Local. I think I'll also talk to support about this.
Local Listings | | MickEdwards0 -
Schema.org usage when there is no specific value available
I would imagine that removing the "description" element follows the logic of the markup, but it is likely immaterial whether you remove it or not.
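As a sketch of the idea above, here's one hedged way to handle it when generating JSON-LD: build the node programmatically and simply omit any property you don't have a value for. The `Product` type and the `build_jsonld` helper are hypothetical examples, not anything from the original question.

```python
import json

def build_jsonld(name, description=None):
    """Build a minimal schema.org Product node, omitting empty properties."""
    node = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
    }
    # Only include "description" when there is an actual value; an empty
    # string adds nothing for search engines and just clutters the markup.
    if description:
        node["description"] = description
    return json.dumps(node, indent=2)

print(build_jsonld("Blue Widget"))                            # no description key
print(build_jsonld("Blue Widget", "A very blue widget."))     # description included
```

Either output validates; the point is that a missing optional property is cleaner than an empty one.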
On-Page / Site Optimization | | Brandon. 00 -
Multi-Regional Site URLs
Ooh, getting complex!

"USA is our main market, but it would have the URL .com/us/en/. Do you think that would be a problem?" - That should be fine.

"We're using /yy/xx/ (country/language), divided in folders. Does it impact SEO? Would you recommend using hyphens instead?" - It's fine the way it is, as long as country comes first (so that you can use geo-targeting for each country-level folder in Google Search Console).

"In this case, if I'm browsing the home page in the USA (.com/us/en/) the canonical would reference the default home (.com)?" - Nope, and be careful here; the canonical tag should reference the correct URL for the current country/language version (in this case, .com/us/en/). You want each country to have and manage its own value. You could attempt some clever stuff by artificially canonicalizing certain pages to specific countries; however, that'll impact the indexation of those pages in their respective countries, and you'll end up sending people to the wrong versions.

"[...] If I visit .com/us/en/ from Australia I stay in the USA version, and get a notification asking me if I want to go to the Australian version. Does it make sense?" - Yup, perfect.

On your last point about hreflang: it may be that you have some configuration errors, but it may also be that Google believes that, for a given query and a given user, it's better to serve the "incorrect" geographic result (perhaps they're exhibiting strong purchasing behaviour and the product is unavailable in their territory, or an unavailable product page is producing poor user signals and harming that page's performance). I'd definitely do some digging into your hreflang configuration to rule that out first, though!
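To make the self-canonical point above concrete, here is a minimal sketch of how each /country/language/ folder would self-canonicalize and cross-reference the others via hreflang. The domain, the version list, and both helper functions are hypothetical illustrations, not the asker's actual setup.

```python
BASE = "https://www.example.com"  # hypothetical domain standing in for the real site

# (country, language) folders in the /yy/xx/ scheme described above
VERSIONS = [("us", "en"), ("au", "en"), ("es", "es")]

def canonical_url(country, language):
    # Each version's canonical is its OWN folder URL, never the bare .com.
    return f"{BASE}/{country}/{language}/"

def hreflang_tags():
    """One alternate tag per version, plus x-default for the bare domain."""
    tags = [f'<link rel="alternate" hreflang="x-default" href="{BASE}/" />']
    for country, language in VERSIONS:
        code = f"{language}-{country}"  # hreflang uses language-COUNTRY
        tags.append(
            f'<link rel="alternate" hreflang="{code}" '
            f'href="{canonical_url(country, language)}" />'
        )
    return tags

for tag in hreflang_tags():
    print(tag)
```

Every version emits the same full set of tags, which is what gives you the reciprocal ("return tag") structure hreflang requires.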
Local Listings | | JonoAlderson0 -
Geolocation issue: Google not displaying the correct url in the SERP's
Hi Steban, just curious: why have you chosen 301 redirects to send visitors to the specific country versions? A 301 tells Google the original URL has permanently moved, so it seems to me that you should be using 302 (temporary) redirects instead.
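For illustration, a minimal sketch of the 302 approach, with a hypothetical country-to-URL mapping (the function and paths are examples, not Steban's real configuration):

```python
def geo_redirect(country_code):
    """Return (status, location) for a geolocation redirect.

    302 ("Found") tells crawlers the move is temporary, so the generic URL
    keeps its place in the index; a 301 would tell Google the page has
    permanently moved to one country-specific URL.
    """
    targets = {"us": "/us/en/", "au": "/au/en/"}  # hypothetical mapping
    location = targets.get(country_code.lower(), "/")  # fall back to the default home
    return 302, location

print(geo_redirect("AU"))  # (302, '/au/en/')
print(geo_redirect("fr"))  # (302, '/') - unmapped countries get the default
```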
International Issues | | ederdesign0 -
What's the Best Strategy for Multiregional Targeting for Single Language?
The correct answer is:

1. Quit geo-targeting in GSC.
2. Implement hreflang annotations. Their implementation alone will avoid the risk of Google considering the "duplicate" versions as duplicates.
3. DON'T canonicalize all the versions to the one you consider canonical. Doing that will break the whole hreflang implementation, and the other countries will always see the canonical URL (for instance, the US one in the UK).
4. Instead, work on canonicalization version by version, as if they were (and actually they are) different websites. This means self-canonicalization and/or canonicalization toward another URL in case of, you know, parameters et al.
5. Try as much as you can to localize the different versions of English you're using. This will improve the localization signals for Google (and will be appreciated by your users). However, if you cannot afford to do that, you're still safe because of the hreflang.

Remember that the href of the hreflang annotations must always present a canonical URL. So, if you implement the hreflang on a canonicalized URL, its href will need to present the canonical URL of the page the hreflang is implemented on. If you don't do this, you will see "no-return" errors in Search Console, Google won't consider your hreflang implementation and, yes, it will start considering your versions duplicate content.
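The canonical-href and return-tag rules above are easy to get wrong at scale, so here is a rough sketch of a checker for them. The page structure (a dict mapping each URL to its canonical and hreflang annotations) is a hypothetical model, not any particular crawler's output.

```python
def check_hreflang(pages):
    """pages maps each URL to {"canonical": url, "hreflang": {code: href}}.

    Flags the two problems described above: an hreflang href that is not a
    canonical URL, and a missing return tag ("no-return" in Search Console).
    """
    errors = []
    for url, data in pages.items():
        for code, href in data["hreflang"].items():
            target = pages.get(href)
            if target is None or target["canonical"] != href:
                errors.append(f'{url}: hreflang "{code}" points at non-canonical {href}')
            elif url not in target["hreflang"].values():
                errors.append(f'{url}: no return tag on {href}')
    return errors

# A clean reciprocal pair: each version self-canonicalizes and lists the other.
pages = {
    "https://example.com/us/en/": {
        "canonical": "https://example.com/us/en/",
        "hreflang": {"en-us": "https://example.com/us/en/",
                     "en-gb": "https://example.com/uk/en/"},
    },
    "https://example.com/uk/en/": {
        "canonical": "https://example.com/uk/en/",
        "hreflang": {"en-gb": "https://example.com/uk/en/",
                     "en-us": "https://example.com/us/en/"},
    },
}
print(check_hreflang(pages))  # [] - no errors
```

Dropping the `en-us` entry from the UK page would surface exactly the "no-return" error the answer warns about.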
International Issues | | gfiorelli12 -
What to do when half of my pages aren't being viewed?
I like what Chris said here. If your goal is to get more pageviews on these deeper results, then removing the phone number and address and having them only on the restaurant listing page may earn another click. However, your main issue, getting users to stay on the site longer, is still present. What's the #1 thing people want to know when thinking about going to a restaurant? "Is the food any good?" Users will still probably leave to find reviews on Yelp or Google, so including those valuable details may keep users on the site. Also, I've noticed your Google Analytics tracking script is in the footer. I know this topic is debated, but you're probably losing data on users who bounce before that script loads. Moving it may make things look worse and increase your reported bounce rate, but it will give you a more accurate representation of how users interact with your site.
Technical SEO Issues | | ccox10 -
Bad Google Reviews - Should I Remove the Map From My Website?
Glad to answer such a great question as the one you've asked, Blue Corona. Really a good topic you've brought up here!
Local Listings | | MiriamEllis1 -
Two Companies Merging - Impacts on SEO
Hi Brandon, Thanks for taking the time to answer! We appreciate the response and will take what you said under consideration.
Intermediate & Advanced SEO | | BlueCorona1 -
If we are a local based business, what is the best approach to tracking keywords? Shall we be micro tracking?
I would track whatever is relevant to your business. Are those villages home to valuable customers? If you rank highly for these villages, your exposure could be much greater. It is also easier to rank for small local areas than for large geographical ones. With that said, if you track just the 3 major towns, chances are you are ranking similarly in those smaller villages. It all depends on how much information you want and what is necessary to improve your SEO strategy.
Local Website Optimization | | donsilvernail0 -
Managing negative keywords when multi ad groups trigger for same keyword
Hi, As far as I know, there are no such tools available. I would suggest you use AdWords Editor. You can also contact the AdWords support team about available methods to do the same. Hope this helps.
Paid Search Marketing | | Alick3000 -
Must header tags be placed in top-to-bottom order?
I would say that if the header structure doesn't adhere to a logical hierarchy, the headers lose some of their semantic integrity. It could follow that search engines won't value them as they were intended.
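As a rough sketch of what "logical hierarchy" means in practice, here's a hypothetical checker: headings may return to a higher level at any time, but should never skip a level on the way down (h1 straight to h3, for example). The function names and the regex-based extraction are illustrative only.

```python
import re

def heading_levels(html):
    """Extract heading levels (1-6) in document order."""
    return [int(m.group(1)) for m in re.finditer(r"<h([1-6])", html, re.I)]

def hierarchy_ok(html):
    """True when no heading skips a level going deeper.

    h1 -> h2 -> h3 is fine; h3 followed by h2 (moving back up) is fine;
    h1 followed directly by h3 is a skipped level.
    """
    prev = 0
    for level in heading_levels(html):
        if level > prev + 1:
            return False
        prev = level
    return True

print(hierarchy_ok("<h1>A</h1><h2>B</h2><h3>C</h3>"))  # True
print(hierarchy_ok("<h1>A</h1><h3>C</h3>"))            # False - skipped h2
```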
Search Engine Trends | | Brandon. 01 -
I'm doing a link audit and I want to download a spreadsheet of all the links to my website in OSE. There has to be a way to do that right?!
Hiya! Kristina from Moz's Help Team here.

Unfortunately, a limitation of our API is that we can only export the top 25 links from each linking root domain. This means that, if the majority of your links are coming from a small number of domains, we would only be able to export a small percentage of your links. This includes internal links as well as external links. For example, moz.com has 61k internal links, but we would still only be able to export the top 25 of those internal links.

The reason why is that if someone has 500,000 links and most of them are from the same domain, we wouldn't want to fill up all 10,000 links with the same domain information and leave out important links from other domains. Instead, we'll only show the top 25 based on Page Authority, so that we can show links from other sites without one domain taking up all 10,000 spaces. In other words, we want to show link diversity. The advanced reports have the same limitation on the number of links per linking root domain, and can only give links from an increased number of root domains.

If you want to obtain more than 25 links from a particular domain to a target (we export only 25 per domain), this is only achievable through the Mozscape API directly. For example, if you want to find the pages that link to the yahoo.com root domain from the wikipedia.org root domain, you could issue the following query:

http://lsapi.seomoz.com/linkscape/links/yahoo.com?Scope=page_to_domain&SourceDomain=wikipedia.org&SourceCols=4&Limit=50&Offset=0

This will return the source URLs of the first 50 pages from anywhere in the wikipedia.org root domain to anywhere in the yahoo.com root domain. Then, you can set the Offset parameter to 50 and run the same call, and so on, always incrementing Offset by 50. As soon as you get back fewer than 50 links, you know you've reached the end of the links we have that fit your parameters. Just add up all the links you've seen, and you've got your count!

Please note the limit for free users is 1,000 links total. There is no limit with a paid subscription; however, higher offsets take longer to process and may time out.

I hope this helps - do let me know if there's anything else I can assist with! And as always, you can reach out to our team directly any time by emailing help@moz.com or clicking the blue chat icon on the lower right of the product! Thank you, -Kristina
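The Offset loop described above can be sketched generically. To keep the example self-contained, the actual HTTP call is abstracted into a `fetch_page` function; `fake_fetch` below is a stand-in for a real Mozscape request, not part of any Moz library.

```python
def collect_links(fetch_page, limit=50):
    """Page through a links endpoint as described above.

    Request `limit` results at a time, incrementing Offset by `limit`
    until a short page signals there are no more results.
    """
    links, offset = [], 0
    while True:
        page = fetch_page(offset, limit)
        links.extend(page)
        if len(page) < limit:   # short page => end of the result set
            return links
        offset += limit

# Hypothetical fetcher standing in for the real Mozscape HTTP call.
def fake_fetch(offset, limit):
    data = [f"link-{i}" for i in range(120)]  # pretend the API holds 120 links
    return data[offset:offset + limit]

print(len(collect_links(fake_fetch)))  # 120 - three calls: 50, 50, then 20
```

Swapping `fake_fetch` for a function that issues the query above (with your access credentials) gives you the full export, up to your plan's limits.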
Link Explorer | | KristinaKeyser0 -
Product search URLs with parameters and pagination issues - how should I deal with them?
Hi Zack, Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
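For illustration, here's a rough sketch of the two pieces mentioned above: self-canonicalizing parameterized URLs to a cleaned version, and emitting the prev/next annotations for a paginated series. The parameter names being stripped and both helper functions are hypothetical examples, not Zack's actual setup.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Hypothetical non-essential parameters to drop from canonical URLs;
# pagination parameters like "page" are deliberately kept.
STRIP_PARAMS = {"sort", "sessionid"}

def canonicalize(url):
    """Build a canonical URL by removing non-essential query parameters."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in STRIP_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

def prev_next_tags(base, page, last_page):
    """rel=prev/next annotations for page `page` of a paginated series."""
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base}?page={page - 1}" />')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base}?page={page + 1}" />')
    return tags

print(canonicalize("https://example.com/search?q=shoes&page=2&sort=price"))
for tag in prev_next_tags("https://example.com/search", 2, 5):
    print(tag)
```

The Search Console parameter settings do the equivalent of `STRIP_PARAMS` on Google's side; the canonical tag just makes your preference explicit in the markup as well.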
Intermediate & Advanced SEO | | LoganRay0