Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • The good news is that a recent update has been made, and we're ready to accept beta SNI customers to test it out. Please see the article here that explains this further: https://moz.com/community/q/moz-pro-our-web-crawler-and-sites-that-use-sni. If you're interested in being part of the beta testing, the link to the sign-up page is here: http://goo.gl/forms/LCvL9Ix8JDHfbAvr1

    Getting Started | | KristinaKeyser
    0

  • If the pages are already indexed and you want them completely removed, you need to allow the crawlers in robots.txt and noindex the individual pages. If instead you just block the pages with robots.txt (and I recommend blocking by folder or URL parameter, not individual pages) while they are indexed, they will continue to appear in search results, but with a meta description along the lines of "this page is being blocked by robots.txt." They will keep ranking and appearing because of the cached data. If you add noindex tags to your pages instead, the next time crawlers visit those pages they will see the new tag and remove them from the search index (meaning they won't show up at all). However, make sure your robots.txt isn't blocking the crawlers from seeing this updated code.
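    As a sketch of how the two pieces work together (the tag placement is standard; any folder paths in your own robots.txt are your call):

    ```html
    <!-- In the <head> of each page you want dropped from the index -->
    <meta name="robots" content="noindex">
    ```

    Make sure robots.txt has no Disallow rule covering these pages until they have actually been recrawled and dropped; crawlers must be able to reach the page to see the noindex tag.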

    Technical SEO Issues | | OlegKorneitchouk
    0

  • When it comes to site-wide links, first consider the users--if you think they will actually click on those links and it would be helpful for them, then you should include them. If you were to include those links in the footer, they should be nofollow links.
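    For illustration, a footer link marked nofollow might look like this (the URL and anchor text are hypothetical):

    ```html
    <!-- Site-wide footer link; rel="nofollow" tells search engines not to pass link equity -->
    <a href="https://www.example.com/partner" rel="nofollow">Our Partner</a>
    ```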

    White Hat / Black Hat SEO | | GlobeRunner
    0

  • As per Eric, it has indeed been turned off. To be fair, I hadn't used it for a while because it was unmaintained, and I wasn't sure it was still accurate. Other solutions I have used for authority signals are developed by Moz: I would suggest installing the MozBar toolbar extension and also taking a look at Open Site Explorer. You can get the information you want from there, like Page Authority and Domain Authority, plus Spam Score and more.

    Link Building | | TimHolmes
    0

  • Finally we were able to get a 200 response. Thank you for your input.

    Technical SEO Issues | | ang
    0

  • Dmitrii, yes, it helps a lot; your opinion is valued. My designer has evolved with technology, and he offered both a custom build with minification capabilities as well as WordPress (which I'm not in favor of). He makes good points about both, but I really want to hear from my Moz community before we take the plunge later this year or early next. Thx

    Web Design | | KevnJr
    0

  • Hi Dmitrii, thanks for the response! Our brand is quite well known in our more established territories, but not where new locations are launching. Our industry is exterior painting, so we want to capture those users searching locally (for example, "exterior painters in [city name]"). My thought process was to optimize the "microsite homepage" for a local keyword query, since our main website's homepage already ranks for our brand name. So if there are branded searches, users will still land on our site. I'm new to SEO, so I'm just not 100% sure this is the best strategy.

    Intermediate & Advanced SEO | | kimberleymeloserpa
    0

  • Very hard to respond to this, but I'll do my best. First, there's no real point in starting a new site if it's going to be thin content all over again. If you just post videos from other sites with some text, you're just a re-aggregator of content. Think about what you can do to make your site actually more useful:
    - Are there search, filter, rating, or navigational features your competitors are using that your site lacks?
    - Are your titles accurate and specific, or vague and targeted toward similar terms? You want to be accurate and specific; you don't want to mislead anyone with promised value only to disappoint them.
    - How can you add useful content at scale?
    I'd generally lean toward using the existing site, but it can take a while for Google to reconsider it. It's your call, but the important thing is to make sure your pages aren't thin and you're delivering value.

    White Hat / Black Hat SEO | | Carson-Ward
    0

  • Thanks for the guidance. I'm looking at SEMRush also.

    Moz Tools | | Eric-Kinaitis
    0

  • Hreflang is used at a page level, not at a site level. So no, you should not just set the same hreflang tag on every page of www.mywebsite.com. If the German translation of the page www.mywebsite.com/page.html is available at www.mywebsite.de/page.html, then you must do 2 things:
    1. On www.mywebsite.com/page.html use:
       <link rel="alternate" hreflang="de" href="http://www.mywebsite.de/page.html" />
       <link rel="alternate" hreflang="en" href="http://www.mywebsite.com/page.html" />
    2. On www.mywebsite.de/page.html use:
       <link rel="alternate" hreflang="en" href="http://www.mywebsite.com/page.html" />
       <link rel="alternate" hreflang="de" href="http://www.mywebsite.de/page.html" />
    What this means is that the English page should link to itself and to all other language variants, and there should be "return tags": each of the language variants should link to itself and to all other language variants.

    Intermediate & Advanced SEO | | NickJasuja
    0

  • Thanks Miriam - that makes good sense - many thanks for your feedback. Luke

    Intermediate & Advanced SEO | | McTaggart
    0

  • I've tried them all for my agency - what I've found is it's less about the tool and more about your process. The tool is simply an organizational / automation layer on top of your processes. We ended up finding a project management solution that ran in Google Sheets that we love. We were able to highly tailor it to our needs through customizations. It works for us!

    Online Marketing Tools | | ryanwashere
    0

  • Thanks Matt... the blog helped me a lot.

    Keyword Research | | sandeep.clickdesk
    0

  • Hi Rudd, you've actually answered your own question with the background you gave (and thanks for including it, as this is key to identifying these types of issues): "A week ago we changed our canonical links which were actually randomly referring from .be > .nl and .co.uk to .com." That was certainly the problem, and it's still causing this as a consequence today. You were telling Google that the original versions of your Belgian pages were the Dutch ones, and likewise that the UK version's original was the .com one, so Google did exactly what you specified and showed those other pages in its cache instead of the Belgian and UK ones, because they were "the originals." This has nothing to do with hreflang; it was caused by the way your canonicals were configured and what that configuration meant. It's great that you've already fixed your canonicals, though. The thing is that Google doesn't seem to have updated its information yet; give it a bit more time, since the change happened only a week ago. My recommendation is to resubmit your Belgian and UK versions for recrawling via Google Search Console so Google picks up your current configuration rather than the old one. Once it does, it will show the right content in the cache, as your canonicals are now set to declare each version the original of itself. Thanks, Aleyda

    Local Strategy | | Aleyda
    0

  • Normally in cases like yours, Google shows a message that the remaining results are similar to the ones already displayed. Some reasons why Google is not displaying all results:
    - You have a lot of near-duplicate pages, like http://www.blackbowchauffeur.com.au/theme-park/aussie-world-to-cedar-creek-from-cedar-creek-to-aussie-world/947583845769996550
    - Some of your URLs have a trailing slash and others don't, and your internal linking uses both. The trailing-slash versions have the non-trailing-slash versions as canonical (not sure if this is always the case).
    - You redirect URLs with capitals to the lowercase version, but you use the capitalized versions in your navigation.
    - Pages like http://www.blackbowchauffeur.com.au/airport-transfer-connection-to-and-from-suburb-starting-with-h/ don't really add value to the index either.
    - You might also want to check your 404 pages. They look like a 404, but non-existent pages are 302-redirected to a 404 page that returns HTTP status 200. Example: http://www.blackbowchauffeur.com.au/price/lismore-airport-to-highfields-from-highfields-to-lismore-airport/faq 302-redirects to http://www.blackbowchauffeur.com.au/404-page-not-found, which, despite what the page says, returns a 200 status.
    Summary: after crawling 1,892 URLs on your site, my crawler found 1,215 redirected URLs (quite possibly these count as URLs in the results when you do site:mysite.com), since you continue to use them in your site's navigation. Of the URLs that actually work, a lot are near duplicates. Use a tool like Screaming Frog to clean up your internal links, pointing each one directly at its final destination. Check whether you really need all these detailed pages; they just add weight to your site, and I'm not sure Google will index them properly or show them in the search results. Also correct the 4xx page so it returns a 404 HTTP status. Apart from all of the above, site:mysite.com is not the most reliable way to check which pages are in the index. Check your Search Console (and the landing-page report in Analytics) and combine that information with a log analysis to see which pages Googlebot is actually visiting. You will learn much more about your site's issues than by using the site: command. Dirk

    Inbound Marketing Industry | | DirkC
    0

  • Personally, I'd say to make 100% sure all your 301 redirects are placed properly, and that each old URL is redirected to the absolute best new URL to get the user what they're looking for.
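    As a sketch of what that might look like on an Apache server (the paths here are hypothetical; this assumes mod_alias is enabled and your mappings live in .htaccess or the server config):

    ```apache
    # Map each old URL to the single best new URL, not just the homepage
    Redirect 301 /old-services.html  https://www.example.com/services/
    Redirect 301 /blog/old-post      https://www.example.com/blog/new-post/
    ```

    Redirecting everything to the homepage loses the one-to-one mapping the answer above recommends; each old URL should land the user on its closest equivalent.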

    Technical SEO Issues | | MattRoney
    0