Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • This is a long and detailed query, so I think it will be best to annotate your question with my responses:

    "Dear friends, We have a multi-regional website in English language only having the country selector on the top of each page and it adds countrycode parameters on each url. Website is built in Magento 1.8 and having 1 store with multiple store views."

    - It is probably better to go with a folder-based regional deployment, as Google doesn't tend to weight parameter URLs very strongly at all, unless there are link / citation signals which prove the child (parameter-based) version is more popular than the parent (in which case, they can shuffle around).

    "There is no default store set in Magento as I discussed with developer. Content is same for all the countries and only currency is changed. In navigation there are urls without url parameters but when we change store from any page it add parameters in the url for same page hence there are total 7 URLs."

    - This sounds incredibly complicated. It sounds like at some point someone will leave or forget how things work, and you will be in a big mess.

    "6 URLs for each page (with country parameters) and 1 master url (without parameters) and making content duplicity."

    - Yes, I can see how that would be a problem. Also, you said there was no default URL, but now say there is a master URL. Surely master is default? This may need more explaining for myself or others to help you properly.

    By the way, something very important here: if you're just planning to use hreflangs on their own and change pricing, very often Google won't consider that a good enough effort to give you an international footprint. Google thinks: if you really have identified these new audiences across the world, even if they speak the same language, they are different people with a different culture. Should your content really be EXACTLY the same? No.
    If you do bother to produce different content for different audiences (even if they speak the same language), tailored to their cultural nuances, you will probably get more international rankings. If you don't, and you're just doing the cheapest, fastest thing, you have no value proposition for Google, and thus shouldn't expect to win big (or even at all).

    "We have implemented hreflang tags on each page with url parameters but for canonical we have implemented master page url as per navigation without url parameters Example on this page."

    - Just so you know, a canonical tag acts almost like a no-index tag. It says to Google: I am not the main version of this page, so please never index me; instead index this canonical URL I am linking to. As such, with your current implementation, all of your regional URLs will be taken out of Google's index, unless popularity signals contradict your canonical tags (in which case they may be overridden). Think about it. With hreflangs you are telling Google: go over here and index my other language version. So Google goes over to another page, but that page says: Google, I am not canonical, why are you even here? Go to the canonical master only, don't look at me. So you are really confusing Google by telling them to index pages with hreflangs, then telling them not to with canonical tags.

    "I think this is correct for master page but we should use URL parameters in canonical tags for each country url too and there should be only 1 canonical tag on each country page url. Currently all the country urls are having master page canonical tag as per the example. Please correct me if I am wrong and in this case what has to be done for master page? as google is indexing the pages without parameters too."

    - With your current implementation, Google should (most of the time; this is not absolute) only be indexing the master pages and not indexing any of the regional pages.
    The regional pages all tell Google that they are not canonical and not good for indexing; by using the canonical tags you are telling Google to only index the master. I would personally remove all canonical tags from all regionally appended parameter URLs. If you have parameters firing for other reasons (e.g. changing tabbed content, moving a carousel, UTM campaign tracking), then those should be trimmed out of Google's index using canonical tags. That being said, for your regional parameter URLs it's a different story. You want your regional pages to rank, right? So don't tell Google they are non-canonical by putting canonical tags on them pointing to the master. In fact, I might even put some of them in an XML sitemap and feed them to Google. I would only do this where the regional modifier is the ONLY parameter in the URL. If there are other parameters, I might still use canonical tags; but for just the regional modifier on its own, they should be stripped of canonical tags (if you ever want them to rank).

    "We are also using GEOIP redirection for each store with country IP detection and for rest of the countries which are not listed on the website we are redirecting to USA store. Earlier it was 301 but we changed it to 302. Hreflang tags are showing errors in SEMRush due to redirection but in GWT it's OK for some pages it's showing no return tags only. Should I use x-default tags for hreflang and country selector only on home page like this or should I remove the redirection? However some of the website like this using redirection but header check tool doesn't show the redirection for this and for our website it shows 302 redirection. Sorry for the long post but looking for your support, please."

    - Support is here! Two main things. Firstly, code 303 might be more appropriate than codes 302 or 301. I would not bother with x-default unless you really know what you are doing; since you are already in one Hell of a mess, I would not touch that yet.
    Fix the basics, wait for the dust to settle! Finally, all you need to do for Google is to exempt Google's user-agent of "googlebot" from your regional redirects. That way they don't get bounced around, but users still do.
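    To make the suggested tag layout concrete, here is a minimal sketch: the master URL keeps a self-referencing canonical, the region-only parameter URLs carry no canonical at all, and every version lists the full hreflang set. The domain, parameter name, and region list are hypothetical, not the poster's real setup.

```python
BASE = "https://www.example.com/product-page"  # hypothetical master URL
REGIONS = {"gb": "en-GB", "au": "en-AU", "us": "en-US"}  # param value -> hreflang

def head_tags(current_region=None):
    """Return the <link> tags for one page version (None = the master URL)."""
    tags = []
    if current_region is None:
        # only the master keeps a canonical tag, pointing at itself
        tags.append(f'<link rel="canonical" href="{BASE}" />')
    # every version (master and regional) lists the same hreflang set
    for region, lang in REGIONS.items():
        tags.append(
            f'<link rel="alternate" hreflang="{lang}" '
            f'href="{BASE}?country={region}" />'
        )
    return tags

for tag in head_tags("gb"):
    print(tag)
```

    Note how the regional version emits no canonical tag at all, so it no longer tells Google "don't index me" while its hreflangs say the opposite.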

    International Issues | | effectdigital
    0

  • So many good responses, thank you! I knew there was better stuff out there than the same old same old.

    Keyword Research | | BobGW
    1

  • Most site migrations, whether they involve a redesign or not (or whether they move domain or simply alter existing architecture, e.g. HTTP to HTTPS), will incur a small dip in performance, yes.

    Usually when you perform a site migration, it's for strategic and not tactical reasons. You're usually thinking of the long term. Your belief is that in the long term, the new domain and / or design / architecture will perform better than the old one(s).

    If redirects are not properly handled, you could lose all of your traffic quite easily. If redirects are handled correctly, you're in a much better position, but still likely to suffer some small dent in performance (usually not lasting longer than 1-2 months, if you keep producing good content and earning great links).

    What you have to remember is, if you always play it safe and never 'evolve', you might incur fewer cuts and bruises now and then, but you will die faster. As others overtake you through their efforts, you sink and fall behind. It's worth striding out there and taking a few nicks and cuts to preserve your overall life-span for longer (think of it like regular rigorous exercise: it's painful when you do it, but later you see the benefit).

    301 redirects can translate up to 100% of your SEO authority from one place to another, but they won't always. If there are too many links to redirects, that can make them slightly less effective. If redirects begin to chain (redirects to redirects), or if the wrong type of redirect is used, that can drastically affect the transfer, and you could see as little as 0% of the prior SEO equity on your new domain.

    Another thing: if content is relatively different on the old and new pages (in machine terms, think Boolean string similarity comparison, NOT "oh yeah, as a person it looks similar to me"), that can directly obstruct 301 redirect SEO authority transfer. Google has chosen to rank X page; if you replace it with Y content, then it becomes a risk to Google. If content is mostly new, it mostly has to prove itself again (and redirects become largely nullified).

    To some extent you can get around this by performing backlink amendments: getting webmasters to change their links to your site so that they hit the new domain / architecture and not the old one. This means that the backlinks are not flowing through redirects, and thus Google can have more confidence that the new content is just as good (for similar search terms) as the old content was. If many webmasters decline to update links for you, that could be a sign that your old content was more useful than your new content (so roll back!).

    Your new domain, if it hasn't been used before (ever), may be sand-boxed by Google for a few weeks. That can be a normal thing, until Google digests all the redirects, re-linking and your usage of Search Console's change of address tool (which you absolutely should use, but don't mess it up by even one character or you'll cause yourself months-long headaches).

    Sometimes, if everything goes swimmingly, you can get very lucky and not even see a dip at all. That's not the norm, so don't set all your expectations around that.
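    As a rough illustration of that "machine terms" comparison: the sketch below scores old-vs-new page text with Python's stdlib difflib. Google's real similarity measure is not public, and the example strings are made up; this only shows the kind of mechanical comparison meant, as opposed to a human eyeballing the pages.

```python
from difflib import SequenceMatcher

def content_similarity(old_text: str, new_text: str) -> float:
    """Return a 0..1 similarity ratio between two extracted page texts,
    compared word-by-word rather than by visual impression."""
    return SequenceMatcher(None, old_text.split(), new_text.split()).ratio()

# Made-up example: one number changed out of eleven words scores near 1.0
old = "office space for lease in midtown manhattan from 2000 sq ft"
new = "office space for lease in midtown manhattan from 2500 sq ft"
print(content_similarity(old, new))
```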

    Technical SEO Issues | | effectdigital
    1

  • Hi Alex, Apologies for taking so long to reply to your thorough answer (I work one day a week for this client). This is very useful and clarifies the procedure we have to go through. I did contact Shopify, it is a great platform and they were quite helpful concerning the apps that the platform uses but not helpful when it came to arranging the redirects. So, thank you again for your reply, it's going to make a difference to our SEO! Jim

    Technical SEO Issues | | LucyBee
    0

  • Fab, thank you both for your thoughts. I was 99% sure I was going to do it, just needed someone else to agree with me

    Technical SEO Issues | | RebekahVP
    0

  • Google's John Mueller has stated that exit intent pop-ups do not attract a Google penalty: https://www.youtube.com/watch?time_continue=746&v=gS4_JH-QqSg "What we’re looking for is really interstitials that show up on the interaction between the search click and going through the page and seeing the content. So that’s kind of the place we’re looking for those interstitials. What you do afterwards like if someone clicks on stuff within your website or closes the tab or something like that then that’s kind of between you and the user." Google won't be monitoring the user's full behaviour on your site beyond the initial bounce/non-bounce (at least as far as the search arm is concerned; Analytics is, of course, different!), e.g. they won't see that x happened that caused (or appeared to cause) the user to leave.

    White Hat / Black Hat SEO | | Xiano
    1

  • Thanks again AL123al! I would be concerned about my internal linking because of this problem. I've always wanted to keep important pages within 3 clicks of the Homepage. My worry here is that while these products can get clicked by a user within 3 clicks of the Homepage, they're blocked to Googlebot. So the product URLs are only getting crawled via the sitemap, which would be hugely inefficient? So I think I have to decide whether opening up these pages will improve my linking structure for Google to crawl the product pages, but is that more important than the increase in the amount of pages it's able to crawl, and the crawl budget that wastes?
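    The "3 clicks from the Homepage" concern can be sanity-checked with a breadth-first search over the internal link graph. The graph below is a made-up example (in practice you would build it from a crawl), and note that any links blocked to Googlebot would simply be absent from the graph it sees.

```python
from collections import deque

def click_depths(links, start="/"):
    """Return {url: clicks from start} for every URL reachable via links.

    links: dict mapping a page URL to the list of URLs it links to.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure: products sit 3 clicks from the homepage
site = {
    "/": ["/category"],
    "/category": ["/subcategory"],
    "/subcategory": ["/product-1", "/product-2"],
}
print(click_depths(site))
```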

    Web Design | | Frankie-BTDublin
    0

  • "Google will sometimes use the description tag from a page to generate a search results snippet if we think it gives users a more accurate description than would be possible purely from the on-page content." (Bold is mine) https://support.google.com/webmasters/answer/35624?hl=en

    On-Page / Site Optimization | | jacobmartinnn
    0

  • Good question, actually that's easy to do! You have to get lots of links from high-quality websites with low spam score. There are some guys that offer a service that can help you increase DA in 14 days, or just add Alex on Skype at: an71qu3. They did a great job for my website, so why not recommend them lol. Hope it helps, Aspirant

    Getting Started | | an71qu33
    1

  • Hi! Never going to disagree with Rand. The first analysis is the actual SERP you want to rank on, so audit the SERP. What are the Title tags that are ranking? What is the content that is working? So start there on your Title tag. That should make it an easy decision. WordPress only auto-populates if you want it to; you can set it as per your requirements. Confused on menu title.. is that the navigation?

    Intermediate & Advanced SEO | | ClaytonJ
    1

  • There are numerous aspects that go into page speed. When you run an analysis, whether in Pingdom, GTmetrix, or Google PageSpeed, you will get insights about what is slowing down your site. These are the aspects you want to focus on. The best I find is GTmetrix. It gives you a step-by-step list of everything slowing down your site. Then, you can research each point and find ways to improve them. That said, here are a few areas to check:

    1. Are you running the latest versions of WordPress and PHP? You want at least PHP 7.0 or higher, and you want to be using WordPress 5.0.

    2. Your server speed and the quality of your host will play a bigger role than anything else in the speed of your site. Check with your hosting company whether your server can handle your site. With hosting, quality is a lot more important than price. You can get cheap, shared hosting from an unreliable supplier, but you will get the site speed that goes with it. You want a server that's located in the country where most of your traffic will come from, and you never want it to be "just good enough". It's like towing with a truck: if you need to tow 5,000 pounds, you'll want a truck that can tow at least 7,500 pounds with ease. Not sure if that makes sense, but the point is, make sure your server is capable of towing a lot more than your current site. If you're not sure which hosts are best, do some online research. I won't name names here, but there are 3 or 4 hosting companies that are constantly head and shoulders above the rest.

    3. Your images also play a huge role in your site speed. Make sure they are optimized. Again, PageSpeed and GTmetrix will let you know how big a role your images are playing in your slow site.

    4. Research Leverage Browser Caching.

    5. Research "slow wordpress site". There is a guide available that helps you go step by step to improve speed.

    Site speed is a range of factors. There can be hundreds of factors hurting your site, but the ones above are the major ones. Don't worry too much about your theme. It plays a role, but no theme can be fast if the host is slow, the images are too big, you don't enable browser caching, and your WordPress and PHP versions aren't up to date.

    As for your plugins, there is a plugin called P3 that checks which of your plugins are slowing down your site the most, but I wouldn't recommend it. It caused a few problems on our site when we ran it. That said, the rule of thumb with plugins is: delete anything you don't use, don't use two plugins that do the same thing, and make sure you use reliable plugins that have good reviews and are updated regularly.
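    The image check in point 3 can be sketched as a simple size-budget audit. The 200 KB budget and the file list below are arbitrary illustrations, not figures from PageSpeed or GTmetrix; those tools apply their own heuristics.

```python
def oversized_images(images, budget_bytes=200 * 1024):
    """images: iterable of (filename, size_in_bytes) pairs.
    Return the files over budget, largest first."""
    flagged = [(name, size) for name, size in images if size > budget_bytes]
    return sorted(flagged, key=lambda item: item[1], reverse=True)

# Hypothetical crawl output: filename and byte size per image
sample = [("hero.jpg", 850_000), ("logo.png", 18_000), ("banner.jpg", 320_000)]
for name, size in oversized_images(sample):
    print(f"{name}: {size / 1024:.0f} KB, over the 200 KB budget")
```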

    Intermediate & Advanced SEO | | CJolicoeur
    1

  • Alan, Spot checking, I looked at a few of your redirected old domain pages which are still showing up in the index. Here are my thoughts:

    For the ones which are redirected properly, you might be experiencing a ranking dip because the old domain name contained keywords better matching search terms. For example, if a search contains "office" or "space", it could be that your old domain was perceived by Google as a stronger match. Although Google doesn't place as much weight on exact or partial match domain names as was once the case, it is still a ranking factor, from my recent experience.

    URLs such as http://www.nyc-officespace-leader.com/neighborhoods seem to be properly redirected. From what I can tell, those redirects look fine. Other URLs such as http://www.nyc-officespace-leader.com/Chelsea are being redirected to a non-existent page, with a 404 error, probably because you are using a pattern redirect which is preserving the requested file path. For any of these which previously had rank equity, I would put some effort into redirecting them to a relevant existing page.

    I do see that your old URLs here are http, and the new ones are https. The redirects are working fine, but you might want to check that when you did the change of address, it was on the http version of the old property (to the new domain). Even though we're not supposed to use change of address for http to https migrations, it is still necessary to change the address from the specific version of the old property. So, if you previously had both http and https versions of your old property, you want to make sure you've done the change of address tool for the http one.
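    This kind of spot check can be scripted. The sketch below works offline over a known redirect table rather than making live HTTP requests; the URLs and mapping are illustrative stand-ins, not the real redirect configuration.

```python
def resolve(url, redirects, live_pages, max_hops=10):
    """Follow url through a redirect mapping; return (final_url, hops, status).

    redirects: dict of source URL -> target URL
    live_pages: set of URLs that actually resolve with a 200
    """
    hops = 0
    while url in redirects and hops < max_hops:  # follow chains, capped
        url = redirects[url]
        hops += 1
    status = "ok" if url in live_pages else "404"
    return url, hops, status

# Illustrative table: one good redirect, one pattern redirect to a dead page
redirects = {
    "http://old.example.com/neighborhoods": "https://new.example.com/neighborhoods",
    "http://old.example.com/Chelsea": "https://new.example.com/Chelsea",
}
live_pages = {"https://new.example.com/neighborhoods"}
print(resolve("http://old.example.com/Chelsea", redirects, live_pages))
```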

    Intermediate & Advanced SEO | | seoelevated
    0

  • Hi Alan, Kinsta is a host that offers a container which is more powerful than a VPS in most cases. You get amazing support for the things you are asking questions about here; they will set up GitHub for you and do most of the things I think you want an authoritative opinion on. Kinsta will make code changes for you. ServeBolt will tell your developer what to do; ServeBolt offers bare-metal hosting on their custom colo, and NVMe drives on a VPS are extremely customizable and much faster, with far superior support to your current host. I'm sorry to hear that you have paid in advance; you may want to ask if you can get out of that. If you cannot, you can hire a company like Performance Foundry to do a lot of the work correctly.

    It sounds like your server is loaded with plugins and is not being run correctly by your developer. To be honest, you may want to contact Codeable and have a simple code audit done for $400.

    **https://performancefoundry.com/ is great **
    https://codeable.io/ has hundreds of great developers

    Your question about plugins: after briefly looking at your site I can tell you the code was not done very well, but I would like to log in before saying that for certain. See below:

    https://codeable.io/how-many-wordpress-plugins-too-many/
    https://wp-rocket.me/blog/wordpress-plugins-many/
    https://torquemag.io/2018/02/wordpress-plugins-many-many/

    How many plugins is too many? There's not a number of plugins that's set in stone for all users. It depends heavily on the kind of web host you use, though. For shared or budget cloud hosting, stick between 0 and 5 plugins. If you use cloud hosting, VPS hosting, or a dedicated server, you can run anywhere from 5 to 20 plugins on your site without many issues. Dan Norris, co-founder of WP Curve, recommends never exceeding 20.

    From what I understand of your site (and I have to be honest, I wish I could log into it and give you more insight), it does not look like it was built very well, and it definitely is not being maintained very well. No one should have 50 plugins; there is never a need for that. Even on a 4,000-page jobs site I have never seen anything over 25, ever. The fact that your developer is not removing the plugins is a really bad sign.

    Hope this helps, Tom

    Web Design | | BlueprintMarketing
    1

  • Yes, it will cause a problem if you just do it in such a basic way. Google crawls from numerous data-centres based in different countries. As such, Googlebot will crawl from different places and will keep thinking different areas of your site are going up and down all the time. The remedy, of course, is to exempt the Googlebot user-agent from your redirects.
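    A minimal sketch of that exemption: skip the geo-redirect when the user-agent looks like Googlebot. The function name, store table, and URLs are hypothetical; in production you would also verify crawlers (e.g. by reverse DNS), since user-agent strings can be spoofed.

```python
def geo_redirect_target(user_agent: str, country_code: str):
    """Return a redirect URL for ordinary visitors, or None for Googlebot."""
    if "googlebot" in user_agent.lower():
        return None  # let the crawler fetch the requested URL as-is
    # Hypothetical store table; unlisted countries fall back to the US store
    stores = {"gb": "https://example.com/?country=gb"}
    return stores.get(country_code, "https://example.com/?country=us")

print(geo_redirect_target("Mozilla/5.0 (compatible; Googlebot/2.1)", "gb"))
print(geo_redirect_target("Mozilla/5.0", "gb"))
```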

    White Hat / Black Hat SEO | | effectdigital
    0

  • No problem Tom. Thanks for the additional info — that is helpful to know.

    Technical SEO Issues | | Nomader
    1

  • Hi Davit, I see! Yes, I do think you should start a new thread specific to technical issues, as they are different than the original scope of this thread. You could get some fresh eyes on it!

    Intermediate & Advanced SEO | | MiriamEllis
    1

  • What would a user who wasn't logged in see if they visited it? Depending upon your company, I would suggest exposing the knowledgebase, which could possibly rank for people looking to solve problems that your products would help with. Same with the forum. If you chose to do that, you might be better off simply moving that content to somewhere else in your site structure. For everything else, Google wouldn't typically be able to log in, and what would you actually show it? It's not a customer, so it wouldn't have any tickets or products to list serial numbers of. You could detect that it's Googlebot and show something else, but that's very bad practice!

    Intermediate & Advanced SEO | | Xiano
    0