Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Well, it's working again, and it filled in the missing days. Can't complain about that.

    | Banknotes
    0

  • This video by Matt Cutts is quite (strangely) explicit: cross-linking to the other sites' country/language versions is not a problem and should not be considered a cross-linking scheme. It is better, though, not to use the footer for those links, but the classic flags/country-name selector. A good alternative is creating a page with links to all the other versions of the site, as the Apple site does (but please don't take Apple's geo-targeting as a model; in terms of international SEO it is not a great example). Why is that kind of page useful? Because you can suggest it as the "x-default" URL in the hreflang="x-default" mark-up (here is what Google says about this often-forgotten tag).
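    As a sketch (the URLs are hypothetical), the hreflang annotations with an x-default pointing at a country-selector page might look like this in each page's head:

    ```html
    <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
    <link rel="alternate" hreflang="it-it" href="https://example.com/it-it/" />
    <!-- x-default: the country/language chooser page for unmatched visitors -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/choose-country/" />
    ```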

    | gfiorelli1
    0

  • Dominik, whenever you change URLs (domain names) from one to another, issues like this are going to come up. When you redirect one domain name to another, you pass on all of the history, links, and so forth to the new domain name. When you redirected the old domain name to the new one, you combined the links of both domain names, and the new domain name could have a link profile that is not very good (or worse than the old domain's). I would take a look at all of the links of both domain names and perform a link audit to make sure there aren't any links that need to be removed or cleaned up.
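    For reference, a domain-level 301 redirect of this kind is commonly set up in the old domain's .htaccess, something like the following (assuming Apache with mod_rewrite; the domain names are placeholders):

    ```apache
    RewriteEngine On
    # Redirect every request on the old domain to the same path on the new one
    RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
    RewriteRule ^(.*)$ https://new-domain.com/$1 [R=301,L]
    ```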

    | billhartzer
    0

  • Hi there, Paul and Kevin asked a great question. Did you see their responses? If you can give us some more details, we would love to help. Christy

    | Christy-Correll
    0

  • No. The Expires directive is merely there to tell the browser "this content is good at least until date X", so the browser doesn't need to fetch it again if it has it cached.

    So let's say you've got a dynamically-generated page called widgets.php, and it pulls a list of products, latest pricing, availability, etc. from the database. Let's say you style the page with an external stylesheet.css, you've got some awesome wacky UI stuff in JavaScript in an external wackystuff.js, and your images for navigation on the site are in /img/nav (while product images, which may change more frequently, are in /img/products).

    You might NOT set the Expires header on your .php files and folders, as those should be considered to expire the second after the browser fetches them. You might set the Expires header for the CSS file to a week out; same with the JS and the /img/nav folder, as those only change with a major update to the site. You might set the Expires header for the /img/products folder to maybe an hour or two, so that if a person is flipping back and forth between pages, the browser will use the cached product image for a little while before forcing a refetch. Make sense?
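    As a rough sketch of how that scheme maps to Apache's mod_expires (the lifetimes and paths are illustrative, taken from the example above):

    ```apache
    # In the site root .htaccess (assuming mod_expires is enabled);
    # no ExpiresByType for text/html, so dynamic pages get no Expires header
    ExpiresActive On
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"

    # In /img/nav/.htaccess - navigation images change only on major updates
    ExpiresDefault "access plus 1 week"

    # In /img/products/.htaccess - product images may change more often
    ExpiresDefault "access plus 2 hours"
    ```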

    | MichaelC-15022
    0

  • Adam, keep in mind that if that's low-quality content and you redirect those URLs somewhere else, you're passing any links and any potential quality factors over to the new URLs. When Google Panda came out, one of the only ways to recover was to delete the low-quality content from your site entirely. So why would you want to redirect that low-quality content and pass it on to another URL or another site? I agree with what the others here, Casey and Wesley, have said. Think of the user experience.

    | billhartzer
    1

  • I was going to send you a PM, Colin, but I can't. Drop me a mail at info@inetseo.co.uk and I'll give you a tip or two. -Andy

    | Andy.Drinkwater
    0

  • Ah - okay, I'll see if I can do that. I'm not sure if I can, though, because of the way the ecommerce platform is set up. Yes, the sitemap has been corrected and has all non-https URLs in there. I mean, will Google eventually "catch up", or is this a big issue that I should try everything in my power to fix? (It's not all links coming in via organic - just a handful, but it's important, as those are great leads). Thanks for your help!

    | TheBatesMillStore
    0

  • Hi Oleg, we fixed it. We edited the text block and added some prominent links. Now we wait for G ;) We will also continue to edit other pages the same way.

    | wilcoXXL
    0

  • Ahh, I see, Steve. So there is a partial penalty on the site? If this is not the culprit, I would search out those that are. Detoxing is certainly an eye-opener, and you will most definitely find work to do with that. Andy

    | Andy.Drinkwater
    0

  • Wilco, If they crawled that page, the PA should be correct. Based on what I see external of Moz, it is. Robert

    | RobertFisher
    0

  • Thanks for the response, Gregory. At the moment I am testing setting up a duplicate site on the country-specific domain, to which all local visitors to the .com are redirected. That site then has the meta noindex, follow tag, with hreflang tags and content-language tags for the country implemented, and with the default tags for other languages pointing to the .com version. Worth a shot, I guess.
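    For what it's worth, the combination described might look like this in the head of the country-specific pages (the domains and language codes are placeholders):

    ```html
    <meta name="robots" content="noindex, follow" />
    <meta http-equiv="content-language" content="en-gb" />
    <link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page" />
    <!-- default for all other languages points back to the .com version -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/page" />
    ```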

    | Salience_Search_Marketing
    0

  • Gianluca, Rand's Whiteboard Friday from a couple of weeks ago may help you: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls Though that Whiteboard Friday is about duplicate content issues, one piece you can probably use from it is this: embed the content on the page in an iframe to leave it out of the index; when you use an iframe, the content will not be perceived as part of the URL. Add "noindex" to the HTML document inside the iframe to be 100% sure that search engines do not index it.
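    A minimal sketch of that approach (the file name is hypothetical): the parent page embeds the shared content, and the framed document carries its own noindex directive:

    ```html
    <!-- parent page: the content lives in the iframe, not in this URL -->
    <iframe src="/snippets/shared-description.html"></iframe>

    <!-- inside /snippets/shared-description.html, in its <head> -->
    <meta name="robots" content="noindex" />
    ```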

    | khi5
    0

  • If you have an H2 on your page, maybe try to include the same or highly similar wording in your meta description, and try to avoid any sentences where words relevant to that page (or synonyms of such words) are not found on your page. I have tried this with my own site in the past and it has worked. No statistical evidence beyond what worked for my own site, but consider giving it a shot.

    | khi5
    0

  • Hi, it's easy to tie correlation to causation, but you need to be sure: the addition of a new site section should not usually cause a traffic drop for existing pages. If there really is no cannibalization / duplicate content, then the first thing is to verify there were no other technical issues introduced at the same time (Moz crawls should show new issues cropping up). If nothing technical is apparent, then I would look at which pages lost traffic and extrapolate from that (and your rankings) which keywords seem to have lost traction. If nothing else changed content-wise over that time, then have a look at which sites gained positions for those keywords (maybe some competitors upped their game) and also look at known Google algo updates (maybe you got caught in one of them).

    | LynnPatchett
    0

  • Paul, Your answer was incredibly helpful - so much so that I'm going to endorse it. You not only pointed me in the right direction, you really helped me to feel confident that this will be a good solution for my client. Sincerely appreciate it!

    | MiriamEllis
    0

  • As far as giving them that nudge, Moz's info on redirection may come in handy.  It may be that they just aren't familiar with their options.  Always do everything you can to stay in the good graces of your IT people...

    | Chris.Menke
    0

  • Could you attach a screenshot? I think that could help us understand better what you are seeing. Thanks!

    | KeriMorgret
    0