Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Client wants to delete Google My Business Due to Bad Review
You can remove it through your Google My Business dashboard. Delete the business information and listing details from your GMB data.
Reviews and Ratings | | Njnbiure45r41 -
How does a page with a canonical for another domain impact SEO?
I see, that makes sense. We will proceed with adding the canonical as mentioned. Thanks again!
Intermediate & Advanced SEO | | KendallHershey0 -
Aggregate Rating Markup for Local Restaurant
I'm a local dentist and have the same problem. We have 188 five-star reviews. They appear next to every blog post I write and every services page, but not the homepage. Having those yellow stars in the SERPs and on the map is huge, and you can get them for everything except the homepage. If you're on WordPress, there's a plugin called Google Reviews for Business that works really well for us. So all your team pages, about pages, blogs and menus can have the stars - just not your homepage. You could create individual pages for each dish or meal if you like and have stars next to them. As a dentist, I have a braces services page, an implants one, a cosmetic one and a whitening one, all with reviews in the map, on the page as a slider and in the blogs. So don't be defeated - you can do it really easily with the plugin!
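For anyone wanting to do this by hand rather than via a plugin, the stars come from aggregate-rating structured data on the page. A minimal JSON-LD sketch is below; the business name, rating value and review count are placeholders, not taken from the thread:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Local Restaurant",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "188"
  }
}
</script>
```

Note that Google's review-snippet guidelines expect the rating to be about the specific entity described on the page, which is consistent with the advice above to mark up individual service or dish pages rather than the homepage.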
Reviews and Ratings | | Smileworks_Liverpool0 -
HREFLANG for multiple country/language combinations
Hi Sam, apologies for the slow response. Your question slipped through the net. This is an interesting case! In an ideal world, you'd specify the relationship between all of those pages, in each direction. That's 150+ tags per page, though, which is going to cause some headaches. Even if you shift the tagging to an XML sitemap, that's a *lot* of weight and processing. Anecdotally, I know that hreflang tagging starts to break at those kinds of scales (even more so on large sites, when the resultant XML sitemaps can reach many gigabytes, or when Google is crawling faster than it's processing the hreflang directives), so tagging everything isn't going to be a viable approach. I'd suggest picking out and implementing hreflang for *only* the primary combinations, as you suggest, and reducing the site-wide mapping to the primary variant in each case. Bear in mind that the valuable/primary combinations might not just be the /xx/xx/ or /yy/yy/ versions; there may be some mixed country/language combinations worth including. For the atypical variants, I think that you have a few options:

1. Use meta robots (or x-robots) tags to set noindex attributes. This will keep them out of the index, but doesn't guarantee that you're effectively managing/consolidating value across near-duplicates - you may be quietly harming performance without realising it, as those pages represent points of crawl and value wastage/leakage.
2. Use robots.txt to prevent Google from accessing the atypical variants. That won't necessarily stop them from showing up in search results, though, and isn't without problems - you risk creating crawl dead-ends, writing off the value of any inbound links to those pages, and other issues.
3. Use canonical URLs on all of the atypical variations, referencing the nearest primary version, to attempt to consolidate value/relevance. However, that risks the wrong language/content showing up in the wrong country, as you're explicitly *un*-optimising the location component.

I think that #1 is the best approach, as per your thinking. That removes the requirement to do anything clever or manipulative with hreflang tagging, and fits neatly with the idea that the atypical combinations aren't useful/valuable enough to warrant their own identities - Google should be smart enough to fall back to the nearest 'generic' equivalent. I'd also take care to set up your Google Search Console country targeting for each country-level folder, to reduce the risk of people ending up in the wrong sections.
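For reference, a reciprocal hreflang set for just the primary combinations might look like the sketch below. The domain and folder structure are hypothetical; each listed page would carry the same full set of tags, including a self-reference:

```html
<link rel="alternate" hreflang="en-gb" href="https://example.com/gb/en/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/fr/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

The `x-default` entry tells Google which version to fall back to for visitors who match none of the listed country/language pairs, which fits the "nearest generic equivalent" idea above.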
Search Engine Trends | | JonoAlderson0 -
Best practices for types of pages not to index
Be clear on the purpose of "noindex". Search engines will still crawl the page, but in theory it will not be published in the index. Some search engines may still choose to index the page despite the noindex tag, and the page will still be publicly accessible on your website. As already noted a couple of times, I would be very slow to noindex any page; I can't think of many applications where it would be used. The way I view it, something is either public or private: if it's public, you probably want search engines to find it; if it's private, it should be locked away behind a username and password.
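For completeness, the two standard ways to apply noindex are a meta tag in the page's `<head>`, or an X-Robots-Tag HTTP response header (useful for non-HTML files like PDFs). A minimal example of each:

```html
<!-- In the page <head>: keep the page out of the index,
     but still let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">

<!-- Equivalent HTTP header, set by the server:
     X-Robots-Tag: noindex -->
```

Either way, as the answer notes, the page itself remains publicly reachable by anyone with the URL.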
Technical SEO Issues | | seoman100 -
Google Indexing Stopped
Hi Jeffrey, Hard to draw conclusions without knowing the website. Is the site actually 2.3 million pages or did Google deindex a bunch of duplicate or thin content? I feel this could be what is happening. Are you seeing a decrease in organic traffic and/or rankings since this happened or were these thin-content pages that were not generating any traffic? An average load time of 4 seconds is likely not the issue. It's not ideal for UX (particularly on mobile) but I have seen Google fully index websites with higher average page load times than this. Were you getting these index numbers from Google Search Console or from performing a site search? ("site:") If you want to DM me the site, I can take a closer look for you. Joe
Intermediate & Advanced SEO | | Joe_Stoffel0 -
Does the Moz Pro site crawl, crawl password protected sites?
Hi there! Tawny from Moz's Help Team here! Typically requiring a password to access certain pages on the site will prevent our crawler from reaching them, but if those pages are linked to somewhere else on your site, it's possible for our crawler to still find those pages. This sounds like it might be pretty specific to your Campaign, so I would encourage you to contact us at help@moz.com and we'll do our best to sort through this with you. Thanks!
Link Explorer | | tawnycase0 -
What happens if I structurally uploaded two different images under the same name? Will Google penalise me for it?
I agree with seoman here - not something that will get you penalized, but certainly an opportunity to better optimize your images.
Vertical SEO: Video, Image, Local | | Joe_Stoffel0 -
Referral links
It sounds like this wouldn't be an issue (see my reply below) but if Heinrich says different, I'll reply again
Link Building | | Paddy_Moogan0 -
Hreflang made simple
In fact, what the client really wants is this... any user in the UK who:

- clicks on a link from social media (which goes to domainname.com)
- searches Google for domainname.com
- types in the URL domainname.com

...should ALWAYS be redirected to "domainname.co.uk", so they only see what the client sells direct from the UK. This sounds to me way beyond just hreflang. This sounds like it might need some kind of redirection based on IP?
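As a rough illustration of what IP-based redirection could look like, here is a sketch using nginx with its GeoIP module. This is an assumption about one possible setup, not something from the thread; the domain names match the question, but the file paths are placeholders, and a real deployment needs a MaxMind country database installed:

```nginx
# In the http context (requires ngx_http_geoip_module):
geoip_country /usr/share/GeoIP/GeoIP.dat;

server {
    listen 80;
    server_name domainname.com;

    # Send UK visitors to the .co.uk site, preserving the requested path.
    if ($geoip_country_code = GB) {
        return 302 https://domainname.co.uk$request_uri;
    }

    # Everyone else is served the .com site as normal.
    root /var/www/domainname.com;
}
```

Worth noting: forced geo-redirects can frustrate users and crawlers alike (Googlebot mostly crawls from US IPs), so this approach should be weighed carefully rather than treated as a default.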
International Issues | | muzzmoz0 -
How does Google decide who to rank 1st?
Hi SEO Analytics, You can start understanding how Google ranks websites by reading Moz's Beginner's Guide to SEO. Once you've finished reading it, read it again. Cheers, David
Intermediate & Advanced SEO | | davebuts0 -
Will using display:none; to hide images hurt SEO?
I would be surprised if this really hurts SEO. It's a tactic that might be a little bit in the grey area, but if this is the only thing you're doing that isn't quite by the book, it's definitely not something Google would go after you for.
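For clarity, the pattern in question looks something like the hypothetical snippet below: the image stays in the HTML (so crawlers can see it and its alt text), but CSS hides it from visitors, often only at certain screen sizes:

```html
<img src="banner.png" alt="Seasonal promotion banner" class="desktop-only">

<style>
  /* Hide the image on small screens; the markup is still crawlable. */
  @media (max-width: 768px) {
    .desktop-only { display: none; }
  }
</style>
```

Hiding content for responsive-design reasons like this is generally fine; the grey area appears when hidden content exists only to influence rankings.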
Social Media | | Martijn_Scheijbeler0 -
Disavow 401, 403, 410, 500, 502, 503
I am still trying to decipher whether you're talking about links pointing to your site that resolve in these codes, or whether querying the URLs or domains gives you these codes: 401, 403, 410, 500, 502, 503.

I still think I need to see a little more information. However, if you are cleaning house and you have backlinks that resolve to dead domains, 404s, etc., please remember anyone can make those domains and URLs live again at any time. So it is best to err on the side of caution and disavow spammy backlinks, whether they're coming from a dead/500 backlink or hitting your site with an unauthorized 401. (I have placed what the codes mean below in hopes that it may be of some help to you.)

Successful 2xx
- 200 OK
- 201 Created
- 202 Accepted
- 203 Non-Authoritative Information
- 204 No Content
- 205 Reset Content
- 206 Partial Content

Redirection 3xx
- 300 Multiple Choices
- 301 Moved Permanently
- 302 Found
- 303 See Other
- 304 Not Modified
- 305 Use Proxy
- 306 (Unused)
- 307 Temporary Redirect

Client Error 4xx
- 400 Bad Request
- 401 Unauthorized
- 402 Payment Required
- 403 Forbidden
- 404 Not Found
- 405 Method Not Allowed
- 406 Not Acceptable
- 407 Proxy Authentication Required
- 408 Request Timeout
- 409 Conflict
- 410 Gone
- 411 Length Required
- 412 Precondition Failed
- 413 Request Entity Too Large
- 414 Request-URI Too Long
- 415 Unsupported Media Type
- 416 Requested Range Not Satisfiable
- 417 Expectation Failed

Server Error 5xx
- 500 Internal Server Error
- 501 Not Implemented
- 502 Bad Gateway
- 503 Service Unavailable
- 504 Gateway Timeout
- 505 HTTP Version Not Supported

I hope this helps,
Tom
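If you do go the disavow route, the file you upload to Google's disavow links tool is a plain-text file with one entry per line. The domains and URL below are hypothetical placeholders:

```text
# disavow.txt - lines starting with # are comments.

# Disavow every link from an entire spammy or dead domain:
domain:spammy-example.com

# Disavow a single specific URL:
http://another-example.net/bad-link-page.html
```

Using `domain:` entries is usually safer for dead or spammy domains, since it covers any URL on that domain that might come back to life later, which is exactly the risk described above.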
Intermediate & Advanced SEO | | BlueprintMarketing0 -
Drop in Traffic
Lots of people have problems with redirecting old URLs to new URLs, making sure they are done properly, making sure that none were left behind, putting them in the right format and order in the .htaccess document.
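To illustrate the kind of .htaccess redirects being described, here is a minimal sketch for an Apache server. The paths are placeholders, not from the thread; the key points are that every old URL needs a mapping and that more specific rules should come before broader pattern rules:

```apacheconf
# One-to-one redirect for a single moved page (mod_alias):
Redirect 301 /old-page/ /new-page/

# Pattern-based redirect for a whole renamed section (mod_rewrite):
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

After deploying, it's worth crawling the old URL list to confirm every one returns a single 301 hop to the right destination, with none left behind returning 404.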
On-Page / Site Optimization | | EGOL1 -
Reviews and Google - whats their relationship?
Thanks for taking the time to answer my questions. I will read that article now; I hope it will give more clarity on this topic. I greatly appreciate your help.
Reviews and Ratings | | Ruchy1