Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • This seems more like a link algorithm update mixed with something targeting keyword-rich URLs. We also track around 20,000 keywords on a weekly basis and can see big fluctuations. It looks like Google is devaluing some backlinks or sites en masse, which is causing this. There is no specific pattern we can see, though small business websites seem to be getting hit more than brands. But again, it's too early to single out specific reasons.

    | Navneet
    0

  • Yes, indeed. The 301 needs to be relevant. You cannot redirect www.domain.com/red-pantss to www.domain.com/blue-sweaters. Instead, redirect /red-pantss to /pants, or even better, /red-pants (see the .htaccess sketch after this post).

    | alsvik
    0
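
    A minimal .htaccess sketch of that kind of relevant redirect, assuming an Apache server with mod_alias; the URLs are the hypothetical ones from the post:

        # .htaccess: send the dead URL to its closest match,
        # never to an unrelated page
        Redirect 301 /red-pantss /red-pants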

  • Thanks for your response, Irving Weiss. Our webmaster made a couple of changes since this post, which I'll list at the end. First:

    a) Previously, the robots.txt file was:

        User-agent: *
        Robot-version: 2.0.0
        Crawl-delay: 2
        Request-rate: 1/4
        Sitemap: http://www.888knivesrus.com/sitemap.xml
        Disallow: /c=/

    b) No, and unfortunately the edit/add button is missing from the parameters section in our account.
    c) Not that we've found.
    d) It dropped from 5.7 to 5 million on 1/1 and has remained there.

    Some updates: our webmaster made a couple of changes yesterday to address this issue. Some research we found said that blocking the session ID parameter in robots.txt was preventing Googlebot from seeing the rel=canonical in place and that it should be removed, so they updated the robots.txt to remove it. An X-Robots-Tag of noindex and nosnippet was also added to the pages (there's a config sketch after this post). The web address is www.888knivesrus.com. Thanks again!

    | marketing_zoovy.com
    0
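
    A hedged sketch of the X-Robots-Tag change described above, assuming Apache 2.4 with mod_headers; the "sessionid" query parameter name and the matching rule are assumptions, not taken from the actual site:

        # Serve noindex/nosnippet on session-ID URLs so Googlebot can still
        # crawl them and see the rel=canonical (parameter name is hypothetical)
        <If "%{QUERY_STRING} =~ /sessionid/">
            Header set X-Robots-Tag "noindex, nosnippet"
        </If>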

  • I noticed this with one of my clients, a real estate agent. Our placement stayed the same, but the organic listing has now blended in with the local result. One big downside right now is that it cuts our title tag in half, because the local result is shorter.

    | esztanyo
    0

  • Are you sure your hosting plan doesn't cover more than one site? You can probably set up a new site right alongside the existing one on the same server/hosting package; if not, you can easily upgrade your existing hosting contract. That said, if these sites are directly competing for the same keywords, it probably makes sense to keep them separate and go with another hosting company so that they are seen as distinct.

    | irvingw
    0

  • Thank you for the responses. I thought that might have been the issue; with your input, I'll move forward and try to get the links to appear more natural. Again, I really appreciate everyone's help!

    | Tosten
    0

  • Thank you. Sounds like I'm heading in a worthwhile direction and will plug on!

    | LinkMoser
    0

  • Hi David, It's odd that Roger would pick up on so many 302s. When I ran a crawl of your site with Screaming Frog, the only 302s were for static assets, which I don't think Roger would be crawling anyway. What sort of URLs are you getting the 302s for? If you think these are errors, feel free to contact the help team (help@seomoz.org) and see if they can help you sort this out. You could most likely rewrite whatever WordPress plugin is involved to return a 301 instead of a 302 (or have a WordPress PHP developer do it for you - it should be a pretty simple job; there's a sketch after this post), but overall it probably won't make a huge difference to your SEO. In general, though, you want to minimize your redirects when speeding up your site. For example, SEOmoz uses a CDN, but each static resource is linked directly to the CDN server, without the redirect. For my own WordPress sites, I use WP Super Cache and CloudFlare. Both are free and have sped up my sites amazingly well. In particular, I'm a huge fan of CloudFlare, given the extra security measures. Well worth checking out.

    | Cyrus-Shepard
    0
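
    A minimal sketch of that kind of plugin tweak, assuming the plugin issues its redirect through WordPress's wp_redirect(); the hook placement and the my_plugin_target_url filter are hypothetical stand-ins for wherever the plugin computes its destination:

        <?php
        // wp_redirect() defaults to a 302; passing 301 makes it permanent.
        add_action( 'template_redirect', function () {
            $target = apply_filters( 'my_plugin_target_url', false );
            if ( $target ) {
                wp_redirect( $target, 301 ); // was: wp_redirect( $target );
                exit;
            }
        } );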

  • Hi Ross, Google is pretty good today at figuring out that your site is both "www" and non-www, and it typically does a decent job of giving you credit for both versions. In the old days (pre-2010) this wasn't always the case: domain authority didn't always transfer well between the two versions, and this also caused a ton of duplicate content issues. That said, it's still best practice to redirect your visitors (using 301 redirects) to one version or the other. (My first ever Whiteboard Friday touched on this issue.) By doing so, you ensure that link equity is preserved throughout your site, and you eliminate the duplicate content issues that result when both hostnames resolve. Usually a simple .htaccess rule like the one sketched after this post will do the trick.

    | Cyrus-Shepard
    0
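
    A minimal .htaccess sketch of that 301, assuming Apache with mod_rewrite and using domain.com as a placeholder for the real hostname:

        # Redirect the non-www hostname to the www version permanently
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]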

  • I know this is a little late, but I figured I'd comment because I just stumbled upon your question in the SERPs for something like "multiple rich snippets in SERPs". I just posted in a Google+ community about something I found where the review markup, a recipe photo, and a little mini author photo all show up together. I had never seen this before. Check it out: https://plus.google.com/u/0/101946888512778592728/posts/9f2Xh8jnCzA

    | stevefidelity
    0

  • I think separate domains would be the preferred choice in a perfect world, but of course we live in a far-from-perfect one! You're quite right to flag a potential duplicate content issue. Two separate domains may help you rank in two different local versions of Google, but this would require you to have completely unique content for both sites. Navigation and layout can obviously be the same, but the content within your pages will need to be unique, which, depending on your industry, may simply not be possible (there are only so many times you can reinvent the wheel, or rewrite it, as it were).

    With a localised site on a .com domain (either a subdomain or a subfolder would be fine, but if pushed I'd pick a subfolder - www.domain.com/uk/page1.html, for example), you may not need to rewrite as much content, as you can have informational pages and other similar-style pages sitting on the .com domain for all local subfolders to use. However, you will still need to rewrite other pages that would have similar content, such as products or services.

    If you have the time and resources, I'd go for two separate domains. There's definitely a lot of data to support that a .co.uk domain will have an easier time ranking in Google UK than a .com (although it's not essential), which can help get you going. It will require a lot of writing and rewriting, but if you get it right now, at the first stage, it will save headaches in the future.

    | TomRayner
    0

  • Good stuff Steve, hope it all goes well. Just give me a shout if you have any problems.

    | TomRayner
    0

  • To remove tags? Yes, it's Meta Robots: noindex, follow. Check that box and save options. If you need to remove other types of duplicate content, such as author archives or date-based archives, also be sure to check the "Post Types" and "Other" tabs. (The tag this setting outputs is sketched after this post.)

    | MattAntonino
    0
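
    For reference, a "noindex, follow" setting like the one above outputs a standard robots meta tag in the page head - a generic sketch, not necessarily the plugin's exact markup:

        <!-- Keeps tag archives out of the index while still letting
             crawlers follow the links on the page -->
        <meta name="robots" content="noindex, follow">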

  • You can also disallow those pages in your robots.txt file, which will keep Googlebot from crawling them at all - though note that a disallowed URL can still show up in the index if other pages link to it. There's a sketch after this post.

    | chris.kent
    0
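
    A minimal robots.txt sketch of that approach; the /tag/ path is a hypothetical example of a section to block:

        User-agent: *
        Disallow: /tag/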