Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thank you Peter. I appreciate your useful feedback.

    | DavidSpivac
    0

  • Thank you, Alex. Great answer. Cheers, David

    | DavidSpivac
    0

  • Let's say you have an ugly URL, page.aspx?param=value. The ugly URL works: it renders a page using the parameters in the URL. Having a pretty URL rewrite to the ugly URL will still render the page the same way, but then you have both the pretty URL and the ugly URL rendering the same page, causing a canonical issue, so you must 301 the ugly URL to the pretty URL. See "URL rewriting" halfway down the page: http://www.seomoz.org/ugc/microsoft-technologies-and-seo-web-development

    | AlanMosley
    0
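A minimal .htaccess sketch of the pattern AlanMosley describes, assuming Apache with mod_rewrite (the /widgets/ path and param name are hypothetical; an ASP.NET site would use the equivalent IIS rewrite rules):

```apache
RewriteEngine On

# Internally rewrite the pretty URL to the real handler (no redirect;
# the visitor keeps seeing /widgets/blue in the address bar)
RewriteRule ^widgets/([a-z]+)$ /page.aspx?param=$1 [L]

# 301 the ugly URL to the pretty one so only one version gets indexed.
# Matching against THE_REQUEST (the original client request line) keeps
# the internal rewrite above from being caught in a redirect loop.
RewriteCond %{THE_REQUEST} \s/page\.aspx\?param=([a-z]+)\s
RewriteRule ^page\.aspx$ /widgets/%1? [R=301,L]
```

The trailing `?` on the redirect target strips the old query string so the ugly parameters don't reappear on the pretty URL.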

  • Hello Zeal Digital, I use a CDN (Content Delivery Network) for images, CSS and JavaScript. Doing that adds only about $10 per month in cost for a site that has around 800,000 pageviews per month.

    You have complete control over the images. If there is a problem, you can force the CDN to flush a file and reload it from the source. You add code to your .htaccess file that tells the CDN how long to store images before flushing them and getting a new copy. It is all automated; there is generally no work for you to do. I host with softlayer.com and this is part of their service.

    The change from self-hosted images, CSS and scripts made a massive improvement on the server. It is a 16-processor Linux box with twin 15,000 RPM SCSI drives and 12 GB RAM, so it is quite fast! Page delivery times improved by 1-2 seconds. The server is now so lightly loaded that it could be downgraded to save more money.

    It has zero effect on SEO. The CDN is accessed using a CNAME (static.domain.com), so don't worry about it looking like components come from other places. The CDN has servers all over the world, so no matter where the visitors are, it is only a few hops for them to get most of the content, making it much faster for someone in Australia who would normally pull images from a server in the USA.

    Your only problem with Amazon S3 is that they have crashed it a few times, but other than that, it is a good thing to do. I wouldn't advise self-hosting, unless you want to increase your costs, server loading and page delivery times.

    | loopyal
    0
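The .htaccess caching directives loopyal mentions might look something like this sketch, using Apache's mod_expires (the lifetimes are illustrative assumptions, not a recommendation):

```apache
# Tell the CDN (and browsers) how long to cache static assets
# before coming back to the origin server for a fresh copy.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png        "access plus 30 days"
  ExpiresByType image/jpeg       "access plus 30 days"
  ExpiresByType text/css         "access plus 7 days"
  ExpiresByType text/javascript  "access plus 7 days"
</IfModule>
```

The CDN reads the resulting Cache-Control/Expires headers from the origin, so changing these values is how you control when it refetches a file.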

  • Thanks Megan, I figured it was something like that since the errors went away without any further action.  Thanks for putting the issue to rest for me. 

    | Banknotes
    0

  • Hi Mike, The rel=canonical information is in the notices section. It's just a notice to say "hey, it's here, check and make sure you have it set up right". I took a look at your account and at first glance it does look OK (you don't have everything set to the home page and you're taking care of sort parameters). It's not a warning or error, just a notice, and all appears to be OK.

    | KeriMorgret
    0

  • You could refresh your sitemap, then re-submit it to Google Webmaster Tools. This usually works for us.

    | CaseyKluver
    0

  • OK, it looks like your site is ColdFusion, yet your headers suggest it is ASP.NET. Here is a ColdFusion page: http://www.franchisesolutions.com... If you look at the cached page in Google, it looks like some sort of trace page used for debugging; it may be that Google crawled your site while you were debugging.

    | AlanMosley
    0

  • If the cron is working, then I would personally turn to the other forum to see if anyone knows a way to rope those messy URLs in and get them under control. I try to avoid manually generating and updating sitemaps whenever I can, because it's a hassle even on a small site, not to mention the trouble on an ecommerce site. If your site is going to stay that small, then a manual sitemap might be less of a headache for you than customizing Magento. I would focus on keeping a clean sitemap: if the search engines learn that you keep a messy sitemap, they will rely on it less and less. 404 and 500 codes matter especially, but so do redirects and perhaps duplicate content. For further reading: Google Sitemaps Ask For Clean URLs - http://www.johnfdoherty.com/google-sitemaps-ask-for-clean-urls/

    | KaneJamison
    0
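For reference, a "clean" sitemap in the sense KaneJamison describes lists only canonical URLs that return a 200 status; a minimal fragment of the standard sitemaps.org format (the URL is a hypothetical example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical, 200-status URLs belong here:
       no redirects, 404s, 500s, or duplicate-content variants -->
  <url>
    <loc>http://www.example.com/category/product-1</loc>
  </url>
</urlset>
```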

  • That makes sense. So is that hurting or helping our SEO? We have a 301 redirect in place on the www, so I imagine it shouldn't be a big deal.

    | askotzko
    0
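For context, a typical .htaccess rule for the kind of www canonicalization askotzko mentions, assuming Apache with mod_rewrite (example.com is a placeholder domain):

```apache
# 301 www.example.com/* to example.com/* so that only one
# hostname is indexed and link equity consolidates on it.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```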

  • It's very hard to hide content farms; the fact that you have domain privacy means little and is a pattern in itself. There are so many ways to spot a content farm. Dan from Dejan SEO wrote a good article about this: http://www.seomoz.org/ugc/why-link-schemes-fail. But the good news is that for 5 articles it's not going to be a big problem, even on the same IP; 5 is not a big number. Take DiscountASP, one of the biggest hosts: most of their sites are on one IP address, so the chances of getting links from the same IP are quite good. Now, if you had 1,000 links, I would start to worry.

    | AlanMosley
    0

  • Hi Robert, Yes, language is kind of a barrier in this case. And yes, #2 is doing very well. They have a lot of content, and content is king. But we have our brand name, and travelling is not just travelling (guides, quality, transfers, flights), so I think many users read their content but order a package from other companies, like us. They are focused on the DIY group of travellers. Danish is actually such a small language that only 5 million people are fluent. It's amazing we can keep the company running with a Danish website...

    I just found another funny thing about this "rejser til gran canaria" (Danish for "trips to Gran Canaria") keyword. On SEOmoz's ranking report I clearly show as rank 1 on Google and rank 2 on Bing and Yahoo. So SEOmoz actually reads the data as saying we should be number 1 on that particular search phrase. Under the help section, SEOmoz writes that the ranking shown on the page can differ from what I experience using Google due to personalization. But even when I'm logged out from Google and have flushed my cache, I end up #9 on Google. SEOmoz suggests adding "&pws=0" to the Google URL to prevent any personalization, but that doesn't help either. So something is not right!

    Regards, Alsvik

    Oh, and sorry for the name switch. My name is Alsvik. The old name referred to the creator of the account.

    | alsvik
    0

  • These are often hard to diagnose, but you have a few options for digging out the source of these links. First of all, you want to make absolutely sure they aren't coming from your own site. Most of the time we miss something when removing old pages, and our own site is the cause of the problem.

    1. Check Google Webmaster Tools. In the same place they list the 404 errors, they will often tell you where they found the page, under a column named "Linked From". Here's a screenshot: https://skitch.com/cyrusshepard/8jrqx/webmaster-tools-crawl-errors-http-truefabrications.com Clicking on the pages listed will often uncover the source of the link.
    2. Try Screaming Frog or Xenu to crawl your site and see if the RSS feeds or links appear.
    3. Or use one of the SEOmoz crawlers, such as the PRO web app or the Custom Crawl tool.

    Unfortunately, if the pages aren't coming from your site (they could be coming from old feeds that others scraped a long time ago), then about the only thing you can do is file a removal request with Google. This can be a slow and tedious process if you have a number of pages. Hope this helps. Best of luck with your SEO!

    | Cyrus-Shepard
    0

  • This is correct. 301s are done all the time, and it's the way to tell the search engines the new location for the content.

    | KeriMorgret
    0
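A minimal example of the kind of 301 KeriMorgret describes, using Apache's mod_alias in .htaccess (the paths are hypothetical):

```apache
# Permanently redirect the old location to the new one; search
# engines treat the target as the content's new canonical address.
Redirect 301 /old-page.html /new-page.html
```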

  • Are you saying that what had been identified by the SEOmoz tools as a valuable link that is on your site is no longer identified as a valuable link? Or has the link on your site itself disappeared? If the link itself has disappeared, you need to investigate with your IT department. If it's that SEOmoz no longer identified the link as strong, send an email to help@seomoz.org and ask our help team about it. Did that help, or did I misunderstand your question?

    | KeriMorgret
    0

  • I work for a company whose dropdowns closely mirror those employed by a very large ecommerce site: http://www.surlatable.com/. Upon analysis of that URL, there are well over 100 links on the page because of the way the dropdowns are designed (over 400, in fact). However, on closer analysis, only 52 of these internal links are followed, and 370 are not. Based on this, I would recommend that if you have well in excess of 150 or so links, you use nofollow for the less important subcategories, as this site does. (I'll have to change the structure of our site now...)

    | bradkrussell
    1
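For clarity, the followed/nofollowed distinction bradkrussell counts is just a rel attribute on each anchor (the URLs below are hypothetical examples):

```html
<!-- Followed link: counts toward the page's internal link budget -->
<a href="/popular-category">Popular Category</a>

<!-- Nofollowed link: a hint to search engines not to follow or credit it -->
<a href="/minor-subcategory" rel="nofollow">Minor Subcategory</a>
```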

  • I have seen several highly competitive verticals where a lot of exact-match anchor text links are really effective, even when they come from lower-quality sites. I don't think this will work forever, but at present it works too well in many cases. I like to think that quality links are the way to go, but I've seen a lot of cases where a high number of low-quality links is winning.

    | ProjectLabs
    0