Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
What does "Optimal Use of Keywords in Header Tags" really mean?
You were right! Thank you! What confused me was I thought it was saying that the keyword was found multiple times in multiple H1 tags, but what it's really saying is that there are multiple H1 tags, period. You rock, thanks!!
Other Research Tools | | bizmarquee1 -
Huge drop in rankings, traffic and impressions after changing to Cloudflare
This can involve many factors. Is the new website on a dedicated IP with SSL? Did all of the URLs stay the same? Did any new link farms or spam links get added? Was all of the content transferred? Is all of the metadata and internal linking the same as well? Did the client happen to change addresses?
Technical SEO Issues | | WebMarkets0 -
Local SEO - 'Near me' phrases
Thanks, Andrew - but I must admit, I'm totally confused as to what I use and where! Presuming I want to show in local 'near me' searches for one service, do I need to add each address to the service web page in question?
Intermediate & Advanced SEO | | Jack_Jahan1 -
Site Crawl Status code 430
Which, of course, you can't do in Shopify. Maybe we should just collectively get on Shopify to implement this by default.
Technical Support | | kentcclark0 -
I'm wondering if review services like Yotpo and Reviews.io are worth it.
OK, so the review sites are certainly great for improving your click-through rate and influencing your customers, but do not assume they will benefit you across the board. Review sites (see the list of Google-approved partners here) only benefit Google Ads (CPC) by default, and you will need a minimum number of reviews to qualify for the star rating. You may get an organic boost from a high-traffic, quality review site, but like Garrett says - opt for a review site that meets your needs and is also frequented and trusted by your users. Should you wish to benefit from star ratings in the organic results, you will still need some technical knowledge to apply aggregate rating schema to your site, and even then it is not guaranteed they will show.
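As a rough illustration of that last point, here's a minimal sketch of aggregate rating markup in JSON-LD using schema.org's AggregateRating type — the product name, rating value and review count below are made-up placeholders that a review provider or developer would populate from real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "132"
  }
}
</script>
```

Even with valid markup like this in place, whether the stars actually appear in the organic results remains at Google's discretion.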
Reviews and Ratings | | TimHolmes0 -
Domain Authority hasn't recovered since August
Thanks for your responses Maureen. From what I know, sometimes when you alter your site to be 'faster', you have to wait a few days for that to start reflecting in the page-loading speeds. I am pretty sure that, if you have server-side caching enabled and resources have previously been cached non-compressed, then sometimes the old resources will continue being served to people for days (or even weeks) after alterations are made.

This is certainly true of image compression (where the old JPG / PNG files continue to be served after being replaced with more highly compressed versions, since the cache has not refreshed yet) - I am unsure whether that applies to GZip-compressed files or not (sorry!)

From what I understand, page-speed optimisation is not a straightforward, linear process. Many changes you could make benefit 'returning' visitors whilst making the site slower for first-time visitors (and the reverse is also true; there are changes which take you in both directions). Due to these competing concerns, it's often tricky to get the best of both. For example, one common recommendation is to take all your in-line (or in-source) CSS and JS and place it in '.css' or '.js' files which are linked to by your web pages. Because most pages will call in the 'separated out' CSS or JS files as a kind of external common module (library), this means that once a user has cached the CSS or JS, it doesn't have to be loaded again. This benefits returning site-users. On the flip-side, because external files have to be pulled in and referenced on the first load (and because they often contain more CSS / JS than is needed), first-time users take a hit.

As you can see, these are tricky waters to navigate, and Google still doesn't make it clear whether they prefer faster speeds for returning or first-time users. In my experience, their bias floats more towards satisfying first-time users. Some changes that you make, like compressing image files (and making them smaller), benefit both groups; just be wary of recommendations which push one user-group's experience at the expense of another.

For image compression, I'd recommend running all your images (download them all via FTP to preserve the folder structure) through something like Kraken. I tend to use the 'lossy' compression algorithm, which is still relatively lossless in terms of quality (I can't tell the difference, anyway). Quite often developers will tell me that a 'great' WordPress plugin has been installed to compress images effectively. In almost all cases, Kraken does a 25%-50% better job. This is because WP plugins are designed to run on a server which is also hosting the main site and serving web-traffic; as such, these plugins are coded not to use too much processing power (and they fail to achieve a good level of compression). I'm afraid there's still no substitute for a purpose-built tool and some FTP file-swapping :') Remember though, even when the images are replaced, the cache will have to cycle before you'll see gains...

Hope that helps
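If you want to check whether a given resource is actually being served compressed (rather than a stale uncompressed copy from the cache), a quick check with curl will tell you — the example.com URL below is a placeholder for one of your own resources:

```
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://example.com/style.css | grep -i "content-encoding"
```

If this prints "Content-Encoding: gzip" (or "br" for Brotli), compression is working for that resource; if nothing comes back, the copy being served may still be the old uncompressed one.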
Getting Started | | effectdigital0 -
The New and Improved Domain Authority Is Here!
I think it's really important to keep up with Google and stay close to changes. DA was due an update.
API | | salumep21 -
Search Console Click Through Results
No problem, thanks a lot for your advice and if I get an answer I'll be sure to post back on here for you!
Search Engine Trends | | FunktionEvents1 -
Duplicate titles from hreflang variations
Aha, I see! That makes some sense. If the products are 'branded' and therefore the name never changes in any language, you have two options. Let's imagine you are selling a branded air conditioning unit, with the made-up name of GreenAir (maybe it's more economical and uses less electricity, thus the name from the 'green movement').

You could just leave it duplicate:
EN: GreenAir | GreenWave Solutions
FR: GreenAir | GreenWave Solutions

Or you could add more contextual info, which would be better:
EN: GreenAir Environmental Air Conditioning Unit | GreenWave
FR: GreenAir Unité de Climatisation Environnementale | GreenWave

I know, I know - my French sucks (actually that's from Google Translate). But still, you can see that you could add more in there. The hurdle for you will be: what is required, in terms of cost, to deploy to that level of complexity? From a straight-up SEO POV, I stand by my preference. But once the mass translation work and targeted, dev-based implementation are factored in... you may feel otherwise!
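For completeness, here's a minimal sketch of how the hreflang annotations for the two language variants might sit in the page head — the example.com URLs are hypothetical placeholders:

```html
<!-- Head of the FR page, with made-up URLs for the GreenAir product -->
<title>GreenAir Unité de Climatisation Environnementale | GreenWave</title>
<link rel="alternate" hreflang="en" href="https://example.com/en/greenair/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/greenair/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/greenair/" />
```

Each language variant should carry the same full set of alternate links, including a self-reference — that reciprocity is what tells Google the pages are translations rather than duplicates.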
Technical SEO Issues | | effectdigital0 -
Search ranking for a term dropped from 1st/2nd to 106th in 3 months
Thanks for the info! It's good to get a bigger picture of the nefarious 'globe' network, which seems to link to every site on the entire internet with absolutely zero value-add whatsoever for end users. It's interesting to see that you guys got hit by some variants of that pure-spam domain which didn't seem to hit us. Clearly the problem is far more widespread than we had at first anticipated. We also disavowed a whole load of non-globe related domains; those weren't in our export.

What I'm talking about in terms of the 'targeted' methodology is not the deployment of the disavow, but the decision-making process before the disavow file was compiled. We really made sure that we got a very granular view of each and every link before deciding whether to disavow or not. We had rows of metrics against each link before we decided whether to keep or disavow any particular link.

In almost all situations, once we reached deployment, we used domain-level disavow directives. There were only 1-2 exceptions, where the client had good editorial pieces on a site - yet also spammy banner / sidebar links from paid advertising. In such situations we used a mixture of directives, to try (as hard as we could) to let the good links through the net. That being said, very few people will be in that same situation. In the majority of cases, if you don't want one link from a domain - you don't want any!
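To make the domain-level vs URL-level distinction concrete, here's a minimal sketch of what such a mixed disavow file can look like in Google's disavow format — the domains and URL below are made-up placeholders:

```
# Domain-level: discard every link from these spam domains
domain:spam-network-example1.com
domain:spam-network-example2.net

# URL-level: keep this domain's editorial links, disavow only the paid placement
http://mixed-quality-example.com/spammy-sidebar-page.html
```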
White Hat / Black Hat SEO | | effectdigital0 -
Hi, on SEO article submissions, do I only include the link to the page I am trying to promote, or is it best practice to also include a link to the home page or parent page?
Hello! Yes, you can add links to your article, but avoid stuffing and bolding lots of keywords; if you do, your website will not be visible in the SERPs, because it reads as spam. For example: if your article is all about SEO services, you would create a link to the page you are promoting with an anchor like "SEO Services in Lahore".
Content & Blogging | | ed52250 -
Duplicate content in Shopify - subsequent pages in collections
The advice is no longer current. If you want to see what Google used to say about rel=next/prev, you can read that on this archived URL: https://web.archive.org/web/20190217083902/https://support.google.com/webmasters/answer/1663744?hl=en

As you say, Google are no longer using rel=prev/next as an indexation signal. Don't take that to mean that Google are now suddenly blind to paginated content. It probably just means that their base crawler is now advanced enough not to require in-code prompting.

I still don't think that de-indexing all your paginated content with canonical tags is a good idea. What if, for some reason, the paginated version of a parent URL is more useful to end-users? Should you stop Google from ranking that content appropriately by using canonical tags? (Remember: a page whose canonical tag points at a different URL cites itself as non-canonical, making it unlikely that it could be indexed.) Google may not find the parent URL as useful as the paginated variant which they might otherwise rank, so using canonical tags in this way could potentially reduce your number of rankings or ranking URLs. The effect is likely to be very slight, but personally I would not recommend de-indexation of paginated content via canonical tags (unless you are using some really weird architecture that you don't believe Google would recognise as pagination). The parameter-based syntax of "?p=" or "&p=" is widely adopted; Google should be smart enough to think around this.

If Search Console starts warning you of content duplication, maybe consider canonical deployment. Until such a time, it's not really worth it.
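To make the two approaches concrete, here's a sketch of the canonical tag on a hypothetical paginated collection URL — the example.com address and "?p=" parameter stand in for your real Shopify collection URLs, and a page would carry one tag or the other, never both:

```html
<!-- Self-referencing canonical: page 2 remains indexable in its own right -->
<link rel="canonical" href="https://example.com/collections/shirts?p=2" />

<!-- Canonical pointing at the parent: page 2 declares itself non-canonical
     and is unlikely to be indexed (the approach argued against above) -->
<link rel="canonical" href="https://example.com/collections/shirts" />
```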
Intermediate & Advanced SEO | | effectdigital0 -
Control indexed content on a WordPress-hosted blog...
That almost looks like... your client doesn't have WordPress actually installed on their sub-domain at all. It looks like they set up a 'something.wordpress.com' site, which WordPress actually hosts, and somehow overlaid their own sub-domain on it (using DNS / name-server shenanigans). If that is true, then since WordPress hosts the blog, there's not much you can do.

If it is a local WordPress install that does exist on your client's actual website, instead of being 'framed' in (or something shady like that), then I haven't seen this error before and it seems really odd. It smacks of someone trying to cut corners with their hosting environment, trying to 'be clever' instead of shelling out for a proper WP install. Clearly there are limitations...

OK, there's only one other alternative really. This is also technical, though, and I don't know if it would be any easier for your dev guys, but... you can send noindex directives to Google without altering the site / web-page coding, as long as you are willing to play around with the (server-level) HTTP headers. There's something called X-Robots which might be useful to you. You need to read this post here (from Google), starting from (Ctrl+F for): "Using the X-Robots-Tag HTTP header". As far as I know, most meta-robots indexation directives can also be fired through the HTTP header using X-Robots. It's kinda crazy, but it might be your only option.
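As a rough sketch of what that looks like at server level — assuming an nginx server and a hypothetical blog path here; Apache can do the same with mod_headers' "Header set" directive — you attach the directive to responses without touching the page HTML at all:

```nginx
# Send a noindex directive via HTTP header for everything under this path
location /blog/private-section/ {
    add_header X-Robots-Tag "noindex, nofollow";
}
```

Of course, this only works if you control the server config — which, if the blog really is hosted on wordpress.com, you won't.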
Technical SEO Issues | | effectdigital0 -
I want to use a domain that has previously been forwarded elsewhere. Any considerations?
Yes, analyze the links pointing to that domain and verify there aren't many spam links. Also, a link reclamation campaign will most likely be needed for brand mentions, which can be very time-consuming.
Technical SEO Issues | | WebMarkets0 -
Trailing slash URLs and canonical links
Hi Robert. I will get the code checked and most probably set that redirect rule indeed. Many thanks for the advice!
Technical SEO Issues | | GhillC0 -
Hide messenger for crawlers
In general, I don't think that this is a great idea. Although Google does meter out crawl allowance, Google also wants a realistic view of the pages which it is crawling. Your attempt at easing the burden of Google's crawl-bots may be seen as an attempt to 'fake' good page-speed metrics, for example (by letting Google load the web-page much faster than end users do). This could cause some issues with your rankings if uncovered by a 'dumb' algorithm (which won't factor in your good intentions).

Your efforts may also be unnecessary. Although Google 'can' fire and crawl JavaScript-generated elements, it doesn't always do so, and it doesn't do that for everyone. If you read my (main) response to this question, you'll get a much better idea of what I'm talking about here. As such, the majority of the time, you may be taking on 'potential' risk for no reward.

Would it be possible to code things slightly differently? Currently you state that this is your approach: "This means that we are actively adding javascript code which will load the Intercom javascript on each page, and render the button afterwards". Could you not add the button through HTML / CSS, and bind a smaller script to the button which then loads the Intercom JavaScript? (There's a rough sketch of this below.) I am assuming here that the Intercom JavaScript is the large script which is slowing the page(s) down. Why not load that script only on request (seems logical, but I also admit I am no dev - sorry)? It just seems as though more things are being initiated and loaded up-front than are really required.

Google want to know which technologies are deployed on your page if they choose to look; they also don't want people going around faking higher page-speed loading scores. If you really want to stop Google wasting time on that script, your basic options would be: code the site to refuse to serve the script to the "googlebot" user agent, or block the script in robots.txt so that it is never crawled (directive only). The first option is a little thermonuclear and may mean you get accused of cloaking (unlikely), or at the least 'faking' higher page-speed scores (more likely). The second option is only a directive which Google can disregard, so the risks are lower. The downside is that Google will pick up on the blocked resource, and may not elevate your page-loading speed. Even if they do, they may say "since we can't view this script or know what it does, we don't know what the implication for end-users is, so we'll dampen the rankings a little as a risk-assessment factor".

Myself, I would look for an implementation that doesn't slow the site down so much (for users or search-bots). I get that it may be tricky; obviously re-coding the JS from Intercom would probably break the chat entirely. Maybe, though, you could think about when that script has to be loaded. Is it really needed, on page-load, all the time, for everyone? Or do people only need that functionality when they choose to interact? How can you slot the loading of the code into that narrow trench, and get the best of both worlds? Sorry it's not a super simple answer; hope it helps.
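Here is a minimal sketch of that 'load on request' pattern — injecting the widget's script tag only on the visitor's first click. The script URL is a placeholder; you'd substitute whatever loader snippet your chat provider actually gives you:

```html
<button id="chat-launcher">Chat with us</button>
<script>
  // Inject the chat widget script only when a visitor actually asks for it,
  // so crawlers and non-chatting users never pay the page-weight cost.
  document.getElementById('chat-launcher').addEventListener('click', function () {
    var widget = document.createElement('script');
    widget.src = 'https://example-chat-provider.com/widget.js'; // placeholder URL
    widget.async = true;
    document.head.appendChild(widget);
  }, { once: true });
</script>
```

The trade-off is a short delay between the first click and the chat actually opening; if that matters, you could instead trigger the load on first scroll or mouse-move, which still keeps it out of the initial page load.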
Technical SEO Issues | | effectdigital0 -
Local Recreational Marijuana Dispensaries Disappearing from Google Maps when Plurals Are Used.
Yes, I was able to contact Google Support via Google My Business. They somehow fixed it on their end. However, I still see the same problem with similar searches. For example, Google "Pot Stores Tacoma WA", and then Google "Pot Store Tacoma WA". On the first search, only three businesses show; on the second search, ALL the businesses show. I am googling from out of state, so distance and centroid shouldn't affect the SERPs.
Local Listings | | isenselogic0