Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi there! If you haven't already, feel free to write in to us at help@moz.com with your campaign details. We can then take a look at your campaign and offer potential suggestions. Thanks! Kevin, Help Team

    Other Research Tools | | kevin.loesken
    0

  • Awesome! Thanks so much for your input, Miriam. Very useful.

    Local Listings | | coolhandluc
    0

  • Hey David, Sorry for the delayed response on this. Before we get started, I should point out that .htaccess syntax is very particular, and you should be extremely careful when editing it. Even a single space out of place can cause massive errors. If you're planning changes, please consult with a developer (or three) on your team!

    I think the best way to explain this is to go through exactly what the .htaccess rewrite directives are doing.

    RewriteCond %{REQUEST_URI} ^/(En|Es)$ [NC]

    RewriteCond - the condition under which a rewrite will take place
    %{REQUEST_URI} - the URI requested from the server (everything after the domain and TLD; e.g. moz.com/community's URI would be /community)
    ^ - denotes the beginning of a regular expression (regex)
    / - literally just /
    (En|Es) - the '()' is simply a grouping and the '|' means OR, so this matches En OR Es
    $ - denotes the end of a regex
    [NC] - means "no case", so the match is not case sensitive

    So literally this says: execute the rewrite when the requested URI (after the .com or whatever TLD you use) is either /En or /Es, paying no attention to case.

    RewriteRule ^(En|Es)/(.*)$ $2?lang=$1 [L,R=301]

    RewriteRule - the rule executed when the preceding RewriteCond is met
    ^ - denotes the beginning of a regex
    (En|Es) - again, a grouping matching En OR Es
    / - literally just /
    (.*) - a wildcard: the '()' is a grouping, and '.*' means zero or more arbitrary characters
    $ - denotes the end of a regex
    $2 - the second captured group on this line, i.e. whatever (.*) matched (everything after En/ or Es/)
    ?lang= - this literally writes '?lang=' (without the quotes)
    $1 - the first captured group on this line, i.e. whichever of En or Es was captured
    [L] - tells the server to stop rewriting after the preceding rule is processed
    [R] - instructs Apache to issue a redirect, causing the browser to request the rewritten URL
    [301] - corresponds to a Moved Permanently header code
    [L,R=301] - combines all three of these into one

    For this I think it's easiest to just use an example: moz.com/En/htaccess-is-fun. Since this URL passes the RewriteCond, it goes on to the RewriteRule, which finds En OR Es and stores that value as $1 (En), then takes whatever is left and stores it as $2 (htaccess-is-fun). It then writes htaccess-is-fun?lang=En and replaces the original match (En/htaccess-is-fun) with the new rewrite, making the result moz.com/htaccess-is-fun?lang=En. The new URL is served as a 301 redirect.

    RewriteCond %{REQUEST_URI} !(\.[a-zA-Z0-9]{1,5}|/)$

    RewriteCond - the condition under which a rewrite will take place
    %{REQUEST_URI} - the URI requested from the server, as above
    ! - declares negation; e.g. "!cheese" matches everything except "cheese"
    () - again, a grouping
    \ - escapes a special character, so "\." means a literal dot
    [a-zA-Z0-9] - matches all lowercase letters, all uppercase letters, and all digits
    {1,5} - matches one to five of the preceding class, i.e. any sequence of one to five characters from a-z, A-Z, or 0-9 (e.g. A2ps OR 12345 OR AbC, etc.)
    | - means OR
    / - literally just /
    $ - denotes the end of a regex

    RewriteRule ^(.*)$ $1/ [L,R=301]

    ^ - denotes the beginning of a regex
    (.*) - a wildcard: the '()' is a grouping, and '.*' means zero or more arbitrary characters
    $ - denotes the end of a regex
    $1 - the first captured group on this line
    / - literally just /
    [L,R=301] - as above: stop rewriting, redirect, 301 Moved Permanently

    So whenever this RewriteCond is met (the URI doesn't already end in a file extension or a trailing slash), the rule selects the whole URI and 301-redirects it with a / trailing it. Hope this helps! Let me know if you have any other questions. Regards, Trenton
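For reference, here are the directives discussed above gathered into a single .htaccess fragment, exactly as described. (Note that, as written, the first RewriteCond only matches a bare /En or /Es, while its RewriteRule expects something after the language segment; test on a staging copy before deploying anything like this.)

```apache
# Rewrite /En/... or /Es/... to ...?lang=En|Es (case-insensitive match)
RewriteCond %{REQUEST_URI} ^/(En|Es)$ [NC]
RewriteRule ^(En|Es)/(.*)$ $2?lang=$1 [L,R=301]

# Append a trailing slash when the URI doesn't already end
# in a short file extension or a /
RewriteCond %{REQUEST_URI} !(\.[a-zA-Z0-9]{1,5}|/)$
RewriteRule ^(.*)$ $1/ [L,R=301]
```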

    International Issues | | TrentonGreener
    0

  • Hi Mike, You need to look at this in two ways.

    1. From the machine's perspective. From the Google bot's point of view, comments, whitespace, formatting, etc. are redundant; they don't make the content easier to read, as the bot is only interested in the code itself. Minifying simply means less data to read and download, making the site faster. From an SEO point of view this is an advantage.

    2. From a developer's point of view. Having the HTML legible is important for all the obvious reasons: minified code is horrific to edit and develop. SEO-wise, though, legibility confers no advantage, since keeping the code that way increases the size of the document.

    The solution: maintain two versions of the files - a development copy that is nicely formatted and easy to work on, and a minified version that you regenerate whenever you make changes. There are countless tools that will auto-minify your HTML, CSS, and JavaScript quickly and easily. If you are using a CMS such as WordPress or Magento, this is less likely to be something you can address directly; however, I do believe add-ons and plugins exist that do this for you automatically.
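To make the idea concrete, here is a deliberately crude sketch of what a minifier does (the function and example HTML are made up for illustration; real tools such as html-minifier or htmlmin handle many edge cases, like preserving whitespace inside <pre> blocks):

```python
import re

def minify_html(html: str) -> str:
    """Toy HTML minifier: strips comments and collapses
    whitespace between tags. Illustration only - use a real
    minifier in production."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop HTML comments
    html = re.sub(r">\s+<", "><", html)                      # whitespace between tags
    return html.strip()

src = """
<!-- navigation -->
<ul>
    <li>Home</li>
    <li>Blog</li>
</ul>
"""
print(minify_html(src))  # -> <ul><li>Home</li><li>Blog</li></ul>
```

The development copy stays readable; only the output of a step like this gets served to visitors (and to the crawler).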

    Web Design | | ATP
    0

  • Hi Cosi, Check whether you have registered the right protocol for your domain (http vs. https) and the right subdomain (maybe you're not using www or non-www, but another subdomain?). If you don't see any data at all - no indexation, crawling, or search visibility - it definitely looks like the type of situation where you haven't registered the property with the "right" name matching the final location of your site, where your content and information actually live (maybe you're redirecting to another place?). Another hypothesis is that you still have very, very few pages, which is why you don't get any search visibility data; in that case, however, you should still be able to see data in the crawling and indexation reports. The final possibility is that you're blocking all crawling and indexation of your site, and therefore you don't have any. Take a look at your robots.txt configuration, and also at any potential blocking at other levels, such as the server level in .htaccess, or the noindex meta robots tag (although that would block indexation, not crawling). Thanks, Aleyda

    On-Page / Site Optimization | | Aleyda
    0

  • Not sure if "not yet launched" means it is a brand-new site and domain, or a new redesign/CMS, etc. If the site never went live, you should have no concerns about Google's spider coming to crawl it. But if it is an updated version of an existing website, it's better to create a subdomain. This is the most common practice: you create something like test.domain.com and use robots.txt on that test domain to block all access. It is the safest way that I know. Good luck!
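For reference, the robots.txt served at test.domain.com/robots.txt that blocks all crawlers would simply be:

```
User-agent: *
Disallow: /
```

(robots.txt is advisory only; for a staging site, HTTP authentication on the subdomain is a stronger guarantee.)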

    On-Page / Site Optimization | | Yoav-Blustein
    0

  • Hi Bob, This isn't really the best place to ask this question, since the valuable contributors here aren't here to self-promote or actively pick up work from what is essentially a help forum. To point you in the right direction, Moz does have a Recommended Providers list which is worth checking out.

    Intermediate & Advanced SEO | | ChrisAshton
    0

  • Thanks - good question. I agree with a previous poster that any website should have friendly URLs; that's a given these days and ultimately aids user awareness and navigation. You really should carefully research and budget the time required for URL rewrites - these can be quite dangerous and, if not done properly, can damage your rankings. Breadcrumb trails are certainly a big plus, so that's something I would definitely recommend.

    Search Engine Trends | | Stewart_SEO
    0

  • Great explanation of the Search Console error. I had faced the same issue in Search Console with some of my keywords related to Check Iqama Expiry. But now it's working fine.

    Search Engine Trends | | ndhsne45
    12

  • Thanks EGOL.  Still looking for additional evidence about this.

    Intermediate & Advanced SEO | | RosemaryB
    1

  • I just thought of another one last night... I would do terrible things to have some way of knowing whether or not a domain has been penalised in the past. The fact that a domain can essentially be rendered un-rankable if the history is bad enough is worrying. The fact that we can only possibly learn this the hard way is terrifying! No matter how well you manage expectations and communicate with the client, if you get stuck trying to rank a domain in that state you will fail and you will also be the one lumped with 100% of the blame. I suppose this would require some co-operation from Google to make it possible which is likely why it doesn't already exist.

    Online Marketing Tools | | ChrisAshton
    2

  • Thanks Yossi, this is kind of what I expected I think.  I guess the question should have been "has anyone had Moz crawl issues with their Zendesk support site"? The main issue with our support site is that Zendesk does not allow access to the robots.txt file so there is no way to add regular expressions like the wildcard search/* to it. I will re-post the question as above.

    Other Research Tools | | zspace
    0

  • AH! OK, gotcha. In that case, Martijn was right - you'll need to add the Review type. Required fields for the Review type are:
    reviewBody (text)
    reviewRating (of type: Rating)
    author (of type: Person or Organization)
    So the markup would look something like this:
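A JSON-LD sketch covering those required fields might look like the following (all names and values here are placeholder examples):

```json
{
  "@context": "https://schema.org",
  "@type": "Review",
  "reviewBody": "Great service, would recommend.",
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "5",
    "bestRating": "5"
  },
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "itemReviewed": {
    "@type": "Organization",
    "name": "Example Co"
  }
}
```

This goes in a `<script type="application/ld+json">` tag; you can verify it with Google's structured data testing tools.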

    Technical SEO Issues | | RuthBurrReedy
    0

  • What exactly are you planning to render server-side? In principle, you shouldn't have anything to worry about if you render everything server-side, provided the rendering isn't so slow that it affects Google's measures of page speed. What do you see when you use the 'Fetch and Render' feature in Search Console at present?

    Technical SEO Issues | | StephanSolomonidis
    0