Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Yes, better to 301 them to their most relevant pages. If they are pages with little to no traffic, no PageRank, and no relevant similar page, it's better to just let them 404. Google is fine with pages 404ing and dying out. Just don't keep them in your site navigation, and remove them from your sitemap.xml files and anywhere else you might be referencing them from within your sites.

    | irvingw
    0
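The advice above (301 pages that have a relevant substitute, let the rest 404) can be sketched as a server rule. This is a hypothetical Apache .htaccess fragment with placeholder paths, not something from the thread:

```apache
# Hypothetical example (placeholder paths, assumes mod_alias is enabled):
# 301 a retired page to its closest equivalent.
Redirect 301 /old-widgets-guide /widgets-guide

# Pages with no good substitute simply get no rule here,
# so the server returns a plain 404 for them.
```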

  • I don't think it's a problem, providing all the links on your site are up to date with the new URLs and the sitemap.xml files are also up to date. Google will respider and reindex your site with the new URLs and drop the old ones. You only need to keep the 301s in place for pages which have external links pointing to them. All others should just get reindexed, and Google won't report hundreds of thousands of 301s if the URLs in your code are all up to date. You don't want your site's navigational structure to consist of 301s, and you definitely don't want to daisy-chain 301s.

    | irvingw
    0
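The daisy-chain warning above can be checked mechanically. This is a minimal sketch, not from the thread: `find_redirect_chains` and the sample URLs are hypothetical names, and the input is assumed to be a dict of your 301 rules (old URL → target URL).

```python
def find_redirect_chains(redirects):
    """Return chains (lists of URLs) where a 301 target is itself redirected.

    `redirects` maps old URL -> new URL, as in your rewrite rules.
    A healthy rule set produces no chains: every old URL points
    straight at its final destination.
    """
    chains = []
    for start in redirects:
        path = [start]
        current = redirects[start]
        # Follow the redirect map until we hit a final URL or a loop.
        while current in redirects and current not in path:
            path.append(current)
            current = redirects[current]
        path.append(current)
        if len(path) > 2:  # more than one hop: old -> mid -> ... -> final
            chains.append(path)
    return chains

# Hypothetical rule set for illustration:
rules = {
    "/old-page": "/interim-page",    # daisy chain: should point at /final-page
    "/interim-page": "/final-page",
    "/retired-page": "/final-page",  # fine: a single hop
}
print(find_redirect_chains(rules))  # → [['/old-page', '/interim-page', '/final-page']]
```

Fixing a reported chain just means editing the first rule to point directly at the final URL.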

  • Hi Kristian, Sorry this is a bit late - I logged off when you posted this (I'm in the UK) and only just had a chance to get my 'Moz fix'. By "non-pers google.com" I just meant using the SEOmoz Chrome Toolbar to do a non-personalised check on Google.com for your keywords. If you want to check rankings compared to your competitors, use your SEOmoz Pro account (if you have one), or if you need something large-scale and custom, get in touch with AuthorityLabs. As for the no-follows, yeah - it would make sense to me. Again, it depends on the exact situation, but it sounds like a sensible thing to do... Matt

    | mattbeswick
    0

  • Google has a really good post on this question. Link

    | SEO5Team
    0

  • Setting up an internal author bio would be, in my opinion, a credibility enhancer for the particular article/story; however, in order to take advantage of Google's Authorship feature in the SERPs, the author would need to have a Google+ profile. So essentially, I don't think setting up the profiles would have much (if any) benefit from a search engine perspective. If the authors set up a Google+ profile in the future, you could always integrate it at a later time.

    | edwardrj
    0

  • Thanks, I'm actually not concerned about the quality of the articles. I think my old site is in trouble for having too many keyworded backlinks and too many syndicated articles, but the articles themselves are good.

    | philray
    0

  • I do not know, but your website is not opening: http://www.howtomaketutus.com/?p=home [showing an error]; http://www.howtomaketutus.com/ [getting redirected to http://www.howtomaketutus.com/?p=home]. First, fix this issue. As you have not mentioned anything about receiving mail in your Google Webmaster Tools account, I am assuming that you were hit by an algorithmic penalty. In that case, you need to get rid of as many spammy links coming from low-quality article directories, directories, link pages and link networks as possible. When that is done, you need to use the Disavow tool to cover those spammy links where the link references still remain. You need to wait a few weeks before you can start seeing any difference.

    | Debdulal
    0
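For context on the Disavow step mentioned above: the tool takes a plain-text file where each line is either a comment, a full URL, or a whole domain. This is a hypothetical example with placeholder domains, not from the thread:

```text
# Hypothetical disavow.txt (placeholder domains).
# "domain:" disavows every link from that site; a bare URL disavows one page.
domain:spammy-directory.example
domain:link-network.example
https://article-farm.example/page-linking-to-you.html
```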

  • Hi Damien, Yes, Luis is right about the frequency of PageRank updates in the Google Toolbar. Google updated PageRank just 10 days ago. Here is the link: http://www.seroundtable.com/google-pagerank-update-15922.html The previous PageRank update was done in August 2012. Hope this helps you.

    | SanketPatel
    0

  • You're welcome n_n

    | Valarlf
    0

  • Thanks Paul. You confirmed our understanding. Another head is always better! Thanks again. Martin

    | MartinH
    0

  • Sorry I missed your follow-up question on this, Diane. I would say the original server mentioned is still the better choice. The Xeon processor in it is specifically designed for server use; the i3 processor in this one is the third tier of Intel's consumer processors. In addition, the original is a name-brand Dell built with components specifically for servers - motherboard, power supply, etc. This is important because servers are a much higher-stress environment than most consumer-level computers. It also has a RAID array, which is of major importance in critical servers, i.e. if you lose money when sites are offline. The system you just listed looks to be a "white box" system - a system assembled by the hosting company using whatever parts are most economical. That doesn't mean it's a bad server, just that it's much harder to know the quality of the components. The one thing this last server has in its favour is that it's got 50% more RAM, which is good for heavy server loads, but in my opinion this doesn't outweigh the other advantages of the first server. (And you can simply add more RAM to the original server if and when your websites' needs require it.) All that said, hardware isn't the only thing by which a dedicated server should be judged. The quality, speed and redundancy of the backbone connections to the Internet, the quality and speed of tech support, and the turnaround time for hardware repairs are all critical as well. Hope that helps. Paul

    | ThompsonPaul
    0

  • Thanks for that. I have notified them and it is running faster, so I will keep an eye on it. One of my major problems at the moment is that the speed tool keeps coming up with "Specify image dimensions", which I have done in my Joomla website, and it also comes up with "Optimize images", which I do as I use Photoshop, so I am a bit puzzled why it is saying these two things. But I am happy the speed is faster now after getting a new server; I will keep testing it over the next few days. Thanks everyone

    | ClaireH-184886
    0
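For reference on the "Specify image dimensions" warning discussed above: the speed tool is satisfied when every image has explicit width and height attributes (or CSS dimensions), so the browser can reserve layout space before the file loads. A hypothetical snippet with placeholder values:

```html
<!-- Hypothetical example (placeholder filename and sizes):
     explicit width/height lets the browser lay out the page
     without waiting for the image to download. -->
<img src="/images/logo.png" width="200" height="80" alt="Site logo">
```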

  • I have an ecommerce site hosted on Volusion, and they make the structure /category/product (specifically www.example.com/shortnameurl/productcode). I figure that with 10,000+ sites they have found that this structure is best. They want the best results for their clients so they retain and gain new business. I recently tried to duplicate(ish) the product name, like /black-luxium-camera/luxiumfz200 ... and Google killed me. My suggestion: stick with /categoryORbrand/product, NOT .com/product/, as there is more opportunity to stand out when people search.

    | longdenc_gmail.com
    0

  • Actually, this is the case. The content is the same, but it is translated into different languages. The problem is that my crawl report is flagging it as duplicate content, and it seems that Google is penalizing me for it.

    | Exp
    0
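The standard way to tell Google that pages are translations rather than duplicates is hreflang annotations, which each language version includes for all of its alternates. This is a hypothetical sketch with placeholder URLs and languages, not taken from the thread:

```html
<!-- Hypothetical hreflang annotations (placeholder URLs): every language
     version carries the full set, including a self-reference, so Google
     treats the pages as translations of one another. -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page/">
```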