Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks Mike, great instructions for doing a 301 via the .htaccess file.

    | johnshearer
    0
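For readers landing on this thread later, here is a minimal sketch of the kind of .htaccess 301 the post above refers to. The domain and paths are hypothetical examples, and this assumes Apache with mod_alias and mod_rewrite enabled:

```apache
# Redirect a single moved page permanently (301)
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Redirect an entire renamed directory with mod_rewrite
RewriteEngine On
RewriteRule ^old-dir/(.*)$ http://www.example.com/new-dir/$1 [R=301,L]
```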

  • If you download the crawl report to Excel, one of the fields is titled "referrer". That field shows the page with the broken link.

    | RyanKent
    0

  • Ryan, your analogy is fantastic. I totally understand this now, and it really makes sense to do it this way. Thanks for being patient with me. Again, thanks all for your feedback on this. Kind regards

    | Paul78
    0

  • What this looks like to me is that someone has published your site to a virtual directory within your site. This is easy to do with an IIS server. Can you log in and see the file structure or the IIS control panel?

    | AlanMosley
    0

  • This post from Rand about the "wrong page" ranking may also help. http://www.seomoz.org/blog/wrong-page-ranking-in-the-results-6-common-causes-5-solutions

    | KeriMorgret
    0

  • I really appreciate your time and advice, and I respect your opinions. Actually, I just wanted to give an example (target) of why we are using all these subcategories at the top. We are not satisfied with our Google rank, and this is actually a new design of our site. It's been just a few days since we started using this new system and template. Considering your words, I am now rethinking this and will surely need to take action.

    | idreams
    0

  • Do you have Google Webmaster Tools installed? If you are penalized for, say, buying links or cloaking or something like that, Google will send you a message notifying you of the penalty.

    | Firestarter-SEO
    0

  • I have thought about that, but what criteria would you use to decide if it is worth the $8 for the renewal? I own a couple hundred domains and am also trying to figure out whether I should stop renewing them and let them expire, or 301 them to one of my industry-related websites. I understand that seasoned URLs that are keyword-specific can add a small bit of value, but how much value... $8 worth per year?

    | FidelityOne
    0

  • Hello Richard, that is a great question, and I'm impressed by your attention to detail with regard to PageRank distribution changing as things go in and out of stock. To answer your question, I don't think you risk being penalized for displaying in this way any more than thousands of other sites, including huge brands, risk it by using drop-down divs (e.g. "read more", "transcript") and tabbed product description areas (e.g. "sizes", "description", "technical details", "shipping costs") to break the pertinent information into bite-sized chunks for the user. I work on a site that has checkboxes the user can uncheck to hide certain items if they don't wish to see them. This all uses coding similar to what you have described. As long as you never specifically target Google (as in, say, "if Googlebot, then show this content, else show this other content"), I think you'll be fine.

    With that said, you may want to look into using a View-All rel canonical page to take care of the PageRank distribution issue you mentioned, depending on how it impacts the load time of the page and how many links you will be sending part of your PageRank to: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html . If it were me I'd just stick to the solution you first asked about, but there are plenty of options.

    Also think about the UX when a visitor lands on the out-of-stock product page. All it takes is a few quality raters, or a few hundred organic visitors who land on that page while it's out of stock and give it a bad rating or a fast back-click to the SERPs, and you could find yourself battling the effects of Panda, at least as far as I understand the process. Some options to improve that user experience include: an estimated date the product will come back; the ability to backorder; the ability to sign up for an email alert when it is back in stock; and related product links with images.

    Good luck! Everett

    | Everett
    0
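For reference, the View-All pattern linked in the post above boils down to a canonical tag on each component page of the paginated series, pointing at the view-all page. A minimal sketch (the URLs here are hypothetical):

```html
<!-- On a paginated component page such as /widgets?page=2 -->
<link rel="canonical" href="http://www.example.com/widgets/view-all">
```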

  • You can block all IPs except Googlebot on the server side for any file, but it may lead to a ban because of cloaking.

    | de4e
    0

  • Thanks Yannick, really helpful. Cheers and have a good weekend. Bush

    | Bush_JSM
    0

  • What does a Ruby sitemap look like? Surely Ruby just runs something like 'gen_url_list' and you can output it as sitemap.xml, no? Ignore that; ROR is an XML format, lol. Eh, no search engines support them as far as I know. Why not test it by submitting one through WMT to see if Google accepts it, and let us know the results?

    | StalkerB
    0
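For comparison, the sitemap format search engines do broadly support is the standard sitemaps.org XML. A minimal example (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```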

  • Just my opinion here, since you have some great answers already: I might be completely alone on this, but I definitely would encourage testing. Mock up some SERPs and do some user testing with Mechanical Turk to judge whether it looks spammy, enticing, etc. Here's an example from Australia; look for Travel Insurance Direct (should be #1): http://dis.tl/vph36j I think it might be the only way we can add our personal touch to SERPs, and I love creativity. Good luck!

    | DaveSottimano
    0

  • Thanks Yannick, much appreciated.

    | stevecounsell
    0

  • Wow Ryan. I was just lurking here and what you just explained is awesome! We are all looking for those C block IP addresses, but Google wasn't born yesterday. Funny, up until I read this I had no idea that this was even a possibility. Thanks so much. Now back to do some SEO sleuthing! Cheers!

    | NewGlobalVentures.comSEOTexas
    0