Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Relative URLs can be used, but it's still better to use absolute URLs to avoid mistakes down the line, e.g. you miss a 301 redirect on a subdirectory and both the HTTP and HTTPS versions resolve. Relative URLs work in a pinch, but aren't recommended.

    | Cyrus-Shepard
    0
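
    A minimal sketch of the kind of site-wide redirect Cyrus alludes to, assuming an Apache server with mod_rewrite enabled (so only the HTTPS version of each URL resolves):

    ```apache
    # .htaccess — force HTTPS for every request with a single 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
    ```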

  • Hi Dirk, That sounds great. Thanks for your help; I'm going to act on this solution. Marcel

    | MarcelMoz
    0

  • Header 1. It's the main heading on the page. On the page you're referencing, the H1 tag contains "We Do Hadoop in the Cloud". Your H1 should contain the keyword phrase you're trying to rank for, and there should be only one H1 on the page.

    | DonnaDuncan
    0
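
    To illustrate Donna's point, a page should carry exactly one H1 containing the target phrase, with subsections demoted to H2/H3 (the headings below are placeholders):

    ```html
    <!-- One H1 per page, containing the keyword phrase to rank for -->
    <h1>We Do Hadoop in the Cloud</h1>
    <!-- Subsections use H2/H3, never a second H1 -->
    <h2>Managed Hadoop Clusters</h2>
    ```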

  • Hi Olivier, The question Matt is answering is quite specific: is a "coming soon" page bad for new domains, given that Google seems to prioritize new domains? The answer is "no" for that situation. Your friend's situation is quite different: he has an existing site which is generating some traffic. Unless your "coming soon" page is extremely interesting and rich in content, it will not rank for any keyword; you're basically reducing your site to a one-page site, so the bounce rate will inevitably be very high. If you want to proceed, you just have to put up a redirect rule that redirects all URLs to the index page of your site (which would then be the "coming soon" page). rgds Dirk

    | DirkC
    0
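
    The catch-all redirect Dirk describes could look like this on Apache (a sketch, assuming the "coming soon" page lives at /index.html; a 302 is used since the setup is temporary):

    ```apache
    # Send every URL except the coming-soon page itself to /index.html
    RewriteEngine On
    RewriteCond %{REQUEST_URI} !^/index\.html$
    RewriteRule ^ /index.html [R=302,L]
    ```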

  • Hi Chris, Ryan provided you with a good synopsis of next steps to take. Did it answer your question? If so, please mark it as such. If not, please ask for clarification of the response (or post an update if you have already resolved the issue otherwise). Thanks! Christy

    | Christy-Correll
    0

  • Thanks Brendan for your reply. Changing the link to the TrackTest homepage would be an obvious choice, but it is not possible in this special case for other reasons.

    | tracktest.eu
    0

  • Hi Brendan, That was really close. I tried that one, but it did not work. I spotted the problem under "Custom Post Type Archives" in the "Post Types" tab of the SEO by Yoast plugin. I unticked that option and everything seems fine now. Thank you so much for your help. Cheers

    | gpapatheodorou
    0

  • Great news - strange that these 608 errors didn't appear while crawling the site with Screaming Frog.

    | DirkC
    1

  • Not sure about this specific message you got, but very often messages like this from Webmaster Tools include only a few sample links, and then you have to figure out yourself which other pages have the problem (Google wouldn't dare to make our lives too easy).

    | DirkC
    0

  • It's the only page on that site with a review - I wonder if that might have something to do with it? I'll ask them to resize and smush the images but as you say, it's ranking for other keyword combinations, so that's likely not the root cause. It's a strange one.

    | C-Tech
    0

  • I was addressing that just a few days ago. Google has been kind enough to provide a very detailed guide on how to do that: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls You need to annotate your HTML accordingly; all the details are there, just read it.

    | max.favilli
    0
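
    The annotation Max's link describes pairs each desktop URL with its mobile counterpart, so Google understands they are the same page (example.com is a placeholder domain):

    ```html
    <!-- On the desktop page (http://www.example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="http://m.example.com/page">

    <!-- On the corresponding mobile page (http://m.example.com/page) -->
    <link rel="canonical" href="http://www.example.com/page">
    ```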

  • Hi Anukul As far as I am aware, it is not possible to fetch and render via an API; however, you can double-check the documentation here: https://developers.google.com/webmaster-tools/ It sounds like what you really need to work on is Crawl Optimization. If Google thinks the site is important enough, they will crawl it fast enough to catch all of the changes. AJ Kohn wrote an excellent post on it which I highly encourage you to check out. Here are a few other resources on getting Google to crawl your site faster:
    https://blog.kissmetrics.com/googlebot-optimization/
    http://moz.com/blog/seo-log-file-analysis-part-2
    http://googlewebmastercentral.blogspot.com/2009/08/optimize-your-crawling-indexing.html
    Hope those help!

    | evolvingSEO
    0

  • Hi Matt, Thank you for your response. I'm not sure what duplicate content you are referring to? We are in the process of cleaning up our link profile.

    | Citybase
    0

  • The Moz bar says they are 301'd, so I presume so, but I will double-check with the dev to be sure. If all are definitely 301'd, then I'll just wait and see what happens after the sitemap is updated correctly, and take it from there.

    | Dan-Lawrence
    0

  • I agree with Laura: Big Commerce, OpenCart, and OSCommerce would all be great solutions. I think Big Commerce is the best hosted platform on the market right now. OpenCart and OSCommerce are not hosted solutions, and both are open source, so you would need some coding experience to get them set up.

    | MonicaOConnor
    0

  • Hi Kraismir, While you're correct about the prev/next tags, it's extremely unlikely that anyone will want to use page 2 as an entry point to your website. Also, there is no content on page 2 which differentiates it from page 1; the title tags and meta descriptions are also the same. This is a very common issue with paginated pages, and there are many examples of sites using the technique I described above. Take a look at the source code of page 2 of Yoast.com or page 3 of Laughing Squid and you'll see the same implementation.

    | taryn_s
    0
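
    For reference, the standard rel="prev"/rel="next" annotations mentioned above look like this on a middle page of a paginated series (URLs are placeholders):

    ```html
    <!-- In the <head> of page 2 of a paginated series -->
    <link rel="prev" href="https://www.example.com/category?page=1">
    <link rel="next" href="https://www.example.com/category?page=3">
    ```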

  • As has been suggested, you can request the removal of pages in GWMT, and you should keep any WordPress site and its plugins up to date. To add to this, you might want to look at something like Cloudflare as an extra layer to protect your client's site. We've been using it for a year now and it's made a massive difference, both to performance and security.

    | N1ghteyes
    0

  • Alick300, Thank you. That was evidence straight from the Google source that validates the question my client had! Thank you again.

    | MrGlobalization
    0

  • If the filters work via parameters, you can go into GWT and add them to the exclusion list. Robots.txt might also apply if you can serve the filters via a folder through a bit of URL rewriting. Mostly it's a question of getting Google to ignore the duplicate content, so research specific to your client's situation will be a good start.

    | RyanPurkey
    0
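
    Ryan's robots.txt route only helps once the filters live under a crawlable path; a sketch of that case (the /filter/ folder is a made-up example):

    ```text
    # robots.txt — keeps crawlers out of filtered duplicate listings
    User-agent: *
    Disallow: /filter/
    ```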

  • Hi Brendan! The post you linked is really interesting! Thanks! There is so much info out there that it's quite easy to miss the good stuff. Yep, and for the on-page grader I already checked that one: A Thank you very much for your kind help

    | Bizio
    1