Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • Dear all, what is the best option? And are the options below any good?

    Option A: Disallow sort-order (Only URLs with value = asc). "A single URL may contain many parameters, for each of which you can specify settings. More restrictive settings override less restrictive settings. For example, here are three parameters and their settings." Source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687

    Option B: In robots.txt:
    User-agent: Googlebot
    Disallow: /*.=name$
    For example: www.sub.domain.com/collection.html?dir=desc&order=name
    Source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449

    Thanks!

    | HMK-NL
    0
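
As a point of reference for option B: Googlebot supports `*` and `$` wildcards in robots.txt (these are extensions, not part of the original robots.txt standard), so a pattern targeting the order parameter from the example URL could be sketched like this (hypothetical, matching the sample URL above):

```
# Hypothetical robots.txt sketch: block Googlebot from any URL whose
# query string contains "order=name", e.g.
# www.sub.domain.com/collection.html?dir=desc&order=name
User-agent: Googlebot
Disallow: /*order=name
```

The trailing `$` in the original pattern would restrict the match to URLs that end with the expression; without it, the rule matches the parameter anywhere in the URL.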

  • You are right, Mat. I interpreted the question as being about domain age, which may not have been what Feilim was asking. On the subject of domain age and rankings, here is another Matt Cutts video that may be helpful: http://www.youtube.com/watch?v=-pnpg00FWJY

    | RyanKent
    0

  • Thanks a lot for your answer. I do remember that a few years back, when Alexa was not that famous, not many people bothered with their rankings; it is only recently that there has been a lot of talk about them. Best regards.

    | sherohass
    0

  • Do you have any results yet, Cyril? Curious as to how this is going.

    | KeriMorgret
    0

  • Google Webmaster Tools is handy to use in conjunction with the other tools listed. Also, SEO Powersuite's SEO Spyglass tool is pretty decent.

    | David_ODonnell
    0

  • That is true, Keri -- thank you for clarifying. The client had an ecommerce site built with SBI!. They wanted to change to a different platform and in the process created the site under a new URL. The first site, however, had many links and was ranking for most of their keywords, and it still appears above the new site in searches. That is where we came in: we would (obviously) like to 301 redirect all the pages from the initial site to the new one, so that they don't lose what they had built up. But SBI! says FTP access is not available and 301 redirects are impossible. We're looking for a creative solution around that. Thanks for any help someone can give us.

    | roundabout
    0

  • On your CDN you have the option to disallow search engines from crawling files on subdomains via robots.txt, so you can keep a copy on cdn.example.com and block it via robots.txt for a better user experience.

    | imfkhan
    0
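
A note on the suggestion above: a robots.txt file only applies to the host it is served from, so blocking a CDN subdomain means serving a file at that subdomain's root. A minimal sketch, assuming the hypothetical host cdn.example.com (robots.txt prevents crawling; for a strict noindex, an `X-Robots-Tag: noindex` response header is the more reliable tool):

```
# Hypothetical robots.txt served at http://cdn.example.com/robots.txt
# Blocks all compliant crawlers from everything on the subdomain
User-agent: *
Disallow: /
```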

  • Just a thanks again! I modded ISAPI Rewrite and resolved the problem thanks to your tip. FYI, in case anyone else needs it, here is the mod I did:
    # redirect www.www.domainname.com to www.domainname.com (also redirects the .net and .org variants)
    RewriteCond  Host:  www.www.domainname.(com|net|org)
    RewriteRule  (.*)  http://domainname.com$2 [R]

    | CredA
    0

  • Just a quick note -- I've seen Google index PDFs that were scanned images of a cut-and-paste newsletter from the 1980s with a variety of different fonts. This is not a guaranteed way to keep Google out, and images will also make your files much bigger than just text.

    | KeriMorgret
    0

  • Unfortunately, what Google can do and what they actually do can really vary. If you're mixing signals and doubling up the canonicals, the odds that they'll get it right are pretty low, in my experience. I think it's worth the fix, if these are traffic-generating pages.

    | Dr-Pete
    0

  • I would recommend reformatting the URLs for the pages that have lost rankings. Then choose a few of the pages that retained their rank and try hyphens instead of the + sign. Track them. If they improve, do the rest of them. If they decline, remove the 301 redirects, return them to what they were, and leave the rest as is. Dana

    | danatanseo
    0
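
If the site in question runs Apache, the test redirects described above could be sketched with mod_rewrite in .htaccess. Everything here is hypothetical (the page name, the directory); it only shows the shape of a 301 from a plus-sign URL to its hyphenated twin:

```
# Hypothetical .htaccess sketch: 301-redirect one plus-sign URL
# to its hyphenated equivalent ("\+" matches a literal plus)
RewriteEngine On
RewriteRule ^products/blue\+widgets$ /products/blue-widgets [R=301,L]
```

In per-directory (.htaccess) context the leading slash is stripped from the tested path, which is why the pattern starts at `products/` rather than `/products/`.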

  • Hi Greg, I have a follow-up question. Because these pages are generated dynamically from the base URL, the only place to put the meta noindex tag would be on the parent URL, which of course we don't want to do because it needs to be indexed. Can we add these individual URLs (just the ones with origin codes) to the robots.txt file? Dana

    | danatanseo
    0
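
One option worth noting: Googlebot honors the `*` wildcard in robots.txt, so parameter URLs can be matched without touching the parent URL. A minimal sketch, assuming the tracking parameter is literally named `origin` (an assumption; substitute the real parameter name):

```
# Hypothetical robots.txt sketch: block crawling of any URL carrying
# an "origin=" query parameter; the parameter-free parent URL stays crawlable
User-agent: *
Disallow: /*origin=
```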

  • Hi Mark, That's an interesting statement about the keywords tag being more important than the robots tag. Google has said they totally ignore the meta keywords tag. Can you provide a little more information about the source of this claim?

    | KeriMorgret
    0

  • Sunita is right on this one. There are more benefits from getting likes to a variety of pages than to the homepage alone.

    | Audiohype
    0

  • Hi... nope... if there is a custom description in the post itself, it uses that instead of the template. -Dan

    | evolvingSEO
    0