Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks David, I already had the redirect set up in the .htaccess file. I think the site was crawled before I had added this to the file... total "Occam's razor" moment. But now there are a few thousand duplicates because of /catalogue, /search, etc. I disallowed all of that in the robots.txt file, so now I just have to wait another week!
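
    For reference, disallowing those duplicate-generating paths in robots.txt looks something like this (the exact paths are illustrative, based on the /catalogue and /search examples mentioned above):

    ```
    User-agent: *
    Disallow: /catalogue
    Disallow: /search
    ```

    Note that robots.txt blocks crawling, not indexing, so already-indexed duplicates can take a while to drop out.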

    | adamxj2
    0

  • By providing Google Webmaster Tools with your international targeting information, you are helping Google decide whether your website should appear, and how it should appear (in local results), for a location. It only affects results for geographically related queries in which a user limits the scope of a search to a certain country. It will not affect appearance in search results that are not geographically bounded. Hope that helps.

    | SEO-Buzz
    0

  • I agree with Heiko, it will not hurt your rankings. In the video you referred to, Matt says "all within the same domain"... so I guess you are concerned about having sub-domains. But sub-domains are still within the same domain.

    | SEO-Buzz
    0

  • Thanks Adam, for the feedback and suggestions. Have a nice day!

    | imaginex
    0

  • If you have changed the domain, the only thing you can do is 301 redirect the old domain to the new one, then start promoting it and getting quality links to it. There will be a drop in traffic and rankings, but it will be for a short span of time and they will come back. Don't revert to the old domain, because that will not help much as far as rankings and traffic are concerned; my advice is to create campaigns that actually work for you. Hope this helps!!
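
    As a sketch, a domain-wide 301 in the old domain's .htaccess might look like this (olddomain.com and newdomain.com are placeholders, assuming Apache with mod_rewrite enabled):

    ```
    RewriteEngine On
    # Send every request on the old domain to the same path on the new one
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
    ```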

    | MoosaHemani
    0

  • Hi there,

    I've never seen this happen to a client, so I'm only speaking from a theoretical point of view. My feeling is that it is possible for a penalty to be passed via a 301 redirect. One core reason is that if it didn't, it would be a very easy way for low-quality websites (who don't care about their brand / company name) to just keep switching websites. I don't think it will happen in all cases, but I certainly think it's possible.

    If this were to happen to you, my feeling is that you should create a disavow file on the new domain which contains the links that point to the old domain but now pass through the 301. This feels a bit messy, but I don't see an obvious alternative. The absolute best outcome is to get the links totally removed, which can be tricky sometimes.

    I found this blog post by Dave Naylor which talks a bit more about this subject if you'd like to read more: https://www.davidnaylor.co.uk/what-to-do-with-your-disavow-list-when-301ing-an-old-domain.html

    This Moz Q&A thread also talks about the same situation and has some additional information: http://moz.com/community/q/can-i-dissavow-links-on-a-301-d-website

    Cheers,
    Paddy
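
    For what it's worth, the disavow file itself is just plain text, one URL or domain per line, with `#` for comments; a minimal example (the domains here are placeholders):

    ```
    # Links that pointed at the old domain, submitted for the new property
    domain:spammy-directory.example
    http://low-quality-site.example/page-with-link.html
    ```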

    | Paddy_Moogan
    0

  • Thank you, I will try this and let you know how it has worked out.

    | MikeAquaspresso
    0

  • You want those old spam pages to return a 410 code: gone (permanently). I'm not 100% sure how you will achieve this, though... I'd speak to your hosting company and/or web developer. A 404 code means the page is 'not found', which isn't the same as a 410, which tells the search engines that the page is gone forever, so they won't keep looking for it. Hope this helps! Amelia
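
    If the site is on Apache, one common way to return a 410 is via .htaccess (the paths below are placeholders for the spam URLs in question):

    ```
    # Tell crawlers these old spam URLs are gone for good
    Redirect gone /old-spam-page.html
    RedirectMatch gone ^/spam-directory/.*$
    ```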

    | CommT
    0

  • The reason is that I need to update my website and that seems like an easy way to do it. My business partner can easily add content once I am finished. I do not want to use WordPress or Joomla. I like that I can still keep the .html on my pages. The e-commerce part of it is so fabulously easy.

    | bhsiao
    0

  • Do they have authority? If not, feel free to 404 them or 301 redirect them to the new homepage. If they do, removing them might affect the rankings of the pages they currently link to.

    | OlegKorneitchouk
    0

  • That is the way breadcrumbs function. Those links are the categories of the Aruba page. So if you make another page (e.g. Aruba climate) as a subpage of Aruba, you can add that category to the url and you'll see the Aruba breadcrumb (but not 'Climate').

    | OlegKorneitchouk
    0

  • Google just rolled out Penguin 3.0, so if you have a dubious backlink profile you will most likely have been affected. You may want to look at cleaning up your backlink profile to get back on track. More info on the update here: http://searchengineland.com/google-penguin-3-0-worldwide-rollout-still-process-impacting-1-english-queries-206286

    | twitime
    0

  • We've had clients that have used most of the same - Ultimate I think is the most used, but Yoast is like #2... They're all about the same - made for folks in the WP world who just aren't coders... They do make OK title and meta description changes, but lack any real 'go-for-it' tactical strengths... Wouldn't recommend any of them, really... it's so much easier to code in PHP for your own usages...

    | JVRudnick
    0

  • It's not ideal to have such a massive .htaccess file that it slows down your page load time significantly. But if you have a lot of inbound links that matter, you'll likely want to keep your SEO value intact and use 301 redirects to handle this properly.

    My $0.02: Test! Do a page load test with the .htaccess file off / removed, and then do another one where it is on and live. If there's no significant time difference, you should be okay. We have sites with hundreds or even thousands of lines in the .htaccess file and they run pretty quickly.

    That said, here's why 404 pages aren't ideal to serve. According to Rand Fishkin's Moz blog writeup, Are 404 Pages Always Bad for SEO? (http://moz.com/blog/are-404-pages-always-bad-for-seo):

    "When faced with 404s, my thinking is that unless the page: A) Receives important links to it from external sources (Google Webmaster Tools is great for this), B) Is receiving a substantive quantity of visitor traffic, and/or C) Has an obvious URL that visitors/links intended to reach, it's OK to let it 404."

    According to Moz's redirection best practice guide (http://moz.com/learn/seo/redirection), you want to use a 301 redirect to indicate that the content has moved permanently.

    Finally, here's a post with a great infographic that describes how to do an SEO-friendly migration: http://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic

    Hope this helps! Thanks, -- Jeff

    | customerparadigm.com
    0

  • Actually not a lot; all these tools parse your HTML and show you the page as they load it. The difference is that they check the page with a different user agent. So if you run any of these checks on your current page, it will show you how the page looks to that crawler.

    | Martijn_Scheijbeler
    0

  • Yeah, practically every site out there gets those, so I would look at other reasons for the traffic drop. If it's really bothering you, then just add a domain:askives.com style entry for each of those to your disavow file and submit it.

    | DennisSeymour
    0

  • What Tim said. And be sure to add it again in Webmaster Tools.

    | DennisSeymour
    0

  • If you want to keep any SEO value from inbound links that point to any of those older pages, you'll want to 301 redirect them to a similar page (i.e. a top category page).

    According to Rand Fishkin's Moz blog writeup, Are 404 Pages Always Bad for SEO? (http://moz.com/blog/are-404-pages-always-bad-for-seo):

    "When faced with 404s, my thinking is that unless the page: A) Receives important links to it from external sources (Google Webmaster Tools is great for this), B) Is receiving a substantive quantity of visitor traffic, and/or C) Has an obvious URL that visitors/links intended to reach, it's OK to let it 404."

    According to Moz's redirection best practice guide (http://moz.com/learn/seo/redirection), you want to use a 301 redirect to indicate that the content has moved permanently.

    Here's a post with a great infographic that describes how to do an SEO-friendly migration: http://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic

    Hope this helps! Thanks, -- Jeff
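
    Assuming Apache, a pattern-based 301 from a retired section to its closest category page might be sketched in .htaccess like this (the paths are placeholders):

    ```
    # Send everything under the retired section to the nearest category page
    RedirectMatch 301 ^/discontinued-products/.*$ /products/
    ```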

    | customerparadigm.com
    1

  • Thanks. I completely forgot about the robots.txt method of fixing it. Will give it a go. I will add comparison in there too, just to be safe.

    | daedriccarl
    0

  • Hi there,

    1. It could still be in the index because those are 302 redirects, not 301s. A 302 is temporary, and therefore Google may not de-index those URLs. It also takes time; I've seen Google take months to de-index redirecting URLs. Also, make sure you are not blocking crawling of the dev site, or Google will not see the redirects.

    2. I am not sure how they got there to begin with. I can pretty much always find some sort of error - maybe someone tweeted a staging URL, maybe crawling wasn't blocked, maybe there was one link to staging from the live site, etc. Regardless, somehow Google crawled it. To prevent this in the future, always block crawling of staging servers well before you ever put anything on them.

    3. Usually Google tries to sort this out. They won't give you a penalty for "technical" duplicate content (penalties are more for "malicious" duplicate content, i.e. stealing people's content). So you won't get penalized, but the more you can help Google out by sorting it out, the more time Google can spend crawling the correct site.

    What I would do now: if you do want the staging URLs to redirect (which might not be the best solution if you ever want to go back and work on the staging server again), use 301 redirects and make sure you are allowing crawling of the staging site. Keep it registered in Webmaster Tools so you can monitor the indexation levels.
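
    For blocking crawling of a staging server before anything goes on it, a minimal robots.txt looks like this (for stronger protection, HTTP authentication on the staging host is the usual choice, since robots.txt only stops well-behaved crawlers):

    ```
    User-agent: *
    Disallow: /
    ```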

    | evolvingSEO
    0