Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • The part you asked about is great, but I do notice some other URL issues you might want to take care of. The home link goes to http://www.highstreetvouchers.com/gift-vouchers/index (note the index at the end), so you have a duplicate there. Your special occasions link ends in .jsp, whereas most of your URLs don't, and I see some pages that are duplicates, appearing both with and without the .jsp.

    | KeriMorgret
    0

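    For duplicates like these, the usual fix is a 301 redirect onto one canonical URL. A minimal sketch in Apache .htaccess syntax, assuming an Apache front end (the paths are illustrative, not the site's real rewrite rules):

    ```apacheconf
    # Illustrative only -- adjust to the real site structure.
    RewriteEngine On
    # Collapse /gift-vouchers/index onto the canonical /gift-vouchers/
    RewriteRule ^gift-vouchers/index$ /gift-vouchers/ [R=301,L]
    # Redirect any URL ending in .jsp to the extensionless version
    RewriteRule ^(.+)\.jsp$ /$1 [R=301,L]
    ```
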
  • I agree with Nicholas and Theo: a website's code should be clean and efficient. With that said, to answer the question directly: no, the JavaScript you described will not have any negative impact on SEO except as it pertains to page loading speed. I would suggest using PageSpeed or a similar tool to track page loading efficiency. Test with and without JavaScript enabled. If the difference is very small, then there is no negative impact on SEO. If the difference is larger, then there COULD be an impact.

    | RyanKent
    0

  • I really don't think that the answer is in these numbers... I would simply: improve content, sharpen the optimization, get more backlinks, and make social sharing easy.

    | EGOL
    0

  • Not sure whether they pass value or not, but a likely reason to do this is to keep track of click-outs. By passing the links through the internal system they can view what links are clicked and how many times.

    | Theo-NL
    0

  • Thanks, Malachi! I think the issue came about when we moved the site to the cloud. It looks like the old site was inadvertently moved too. Best, Lisa

    | lhc67
    0

  • It's true that subdomains are not internal links. But if they are your subdomains, they are trusted links. When you place a link, the PR from that page will automatically flow to it. The only question you have to answer is: now that the link juice has been spent, do you want the link's target to benefit from the link and let your PR flow to the page? Can you share the context of where or why you think it would be beneficial to use nofollow on a link to one of your subdomains?

    | RyanKent
    0

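    For reference, nofollow is just an attribute on the anchor tag; the subdomain URL below is a made-up example:

    ```html
    <!-- PageRank still leaves the page, but the search engine is
         told not to pass value on to the target -->
    <a href="http://blog.example.com/" rel="nofollow">Our blog</a>
    ```
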
  • Fantastic. Thank you so much.

    | JetBookMike
    0

  • Hi Maria, Your menu is not optimized, but the real problem is not coming from there. Your home page http://www.medixschool.ca/ has a meta refresh leading to http://www.medixschool.ca/Home/index.php instead of a 301 redirect. You should remove this meta refresh and replace it with a 301 redirect. Best regards, Guillaume Voyer.

    | G-Force
    0

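    As a sketch of the fix being described: instead of the meta refresh in the homepage HTML, the redirect should be sent at the server level. Assuming an Apache setup (the syntax below is one common way to do it; the URLs are taken from the post):

    ```apacheconf
    # Replaces <meta http-equiv="refresh" content="0;url=/Home/index.php">
    # RedirectMatch with ^/$ matches only the root URL, avoiding a loop
    RedirectMatch 301 ^/$ http://www.medixschool.ca/Home/index.php
    ```
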
  • Anyone heard anything about this yet?

    | findachristianjob
    0

  • If you are trying to make more content, always think of the end user first. If you are doing a history, make something unique, like a timeline in an image; that is what SEOmoz does. I used to copy and paste many years ago, but whenever I write original content, I get results. My recommendation: either copy the content into Notepad and rewrite it yourself, or write 100% original content.

    | Francisco_Meza
    0

  • Your present approach is correct. Ensure all these pages are tagged as noindex for now. Remove the block from robots.txt and let Google and Bing crawl these pages. I would suggest waiting until you are confident all the pages have been removed from Google's index, then checking Yahoo and Bing. If you decide that robots.txt is the best approach for your company, you can then restore the disallows after confirming your site is no longer affected by these pages. I would also suggest that, going forward, you ensure any new pages on your site that you do not wish to be indexed always include the appropriate meta tag. If this issue happens again, you will have a layer of protection in place.

    | RyanKent
    0

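    The meta tag in question is the standard robots meta tag, placed in the <head> of each page to be dropped from the index:

    ```html
    <!-- Crawlers must be able to fetch the page to see this tag,
         which is why the robots.txt block has to come off first -->
    <meta name="robots" content="noindex">
    ```
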
  • Mmhh.... on the link you mentioned they write "... will eventually allow embeds to work on mobile devices" .... in my case they apparently don't. But thank you for your time anyway.

    | petrakraft
    0

  • Okay, and thanks Alan!!

    | azguy
    0

  • Many thanks - I'm going to have a serious word with my coders!!

    | James77
    0

  • Antony, If your site structure is to remain the same, use Google's step-by-step guide to migrating a domain; Google recommends redirecting all pages. After you follow Google's suggestions on the move, there are things you can do to speed this up: tweet every URL on your new site that you want indexed, check the site: results for inclusion of the new pages, and don't forget the sitemap and Google Webmaster Tools for your URL preference.

    | blackballonline
    0

  • I can't provide the exact URL, but the dynamic content is a customized search result. User data seems to show that users prefer the content: CTR and pageviews are very high, bounce rate is low, and these are some of the top pages in GWT. We would de-index all of the URLs, but they drive a large percentage of traffic to the site. Thank you.

    | nicole.healthline
    0

  • I think some of the confusion may be due to Google primarily using IP addresses assigned to its headquarters in Mountain View, California. Google has many (around 20) data centers located outside the US. I recall reading an article stating that at times they used their Mountain View IPs from centers around the world; for security reasons they do not wish the locations of all their data centers to be known. I researched this topic before and was unable to locate any official information from Google. It would only seem reasonable that they crawl from all over the world. If they didn't, then a lot of sites which use geo-based targeting for site navigation would not have most of their content indexed. While it's true a sitemap could be used to overcome the issue, many sites do not use sitemaps and still get indexed.

    | RyanKent
    0