Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • Google's crawl rate varies based on many factors. For example, the New York Times website (nyt.com) probably has its home page and category pages crawled a few times each day. A website such as yours, which has static content, offers few links, and has low DA, may require a few weeks to be fully crawled. You can check Google's cache date for your pages on any search result page. Near most results you will see the word "cached". Click the cached link and you will see a header with a date/time stamp of when the cache was last updated.

    | RyanKent
    0

  • If your developer will be making the website dynamic via a system like WordPress, there will be automated ways to keep your sitemap up to date every time you publish a new page, and it will even ping the search engines that the sitemap has been updated. It will be a "set it and forget it" type of thing with sitemaps if you are moving in that direction (there is a small example of such a file below). Good luck!

    | connectiveWeb
    0
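
    For reference, the file such a plugin maintains is just a plain XML sitemap. A minimal sketch is below; the URL and date are placeholders, and the plugin adds one <url> entry per published page:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- The plugin appends an entry like this and refreshes <lastmod>
             each time a new page is published. -->
        <url>
          <loc>http://www.example.com/new-page/</loc>
          <lastmod>2011-06-01</lastmod>
        </url>
      </urlset>

    The "ping" is simply a request to a search engine URL (for example, http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml) telling that engine the file has changed.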

  • James, I have worked on OEM for a long time myself. I tried mini-blog systems like Tumblr, TypePad, Posterous, etc., but it takes time to rank higher on Google Germany. The negative article is on a forum with DA 49, and my sites have DA 40. My goal is to increase the Domain Authority of my brand domains through backlinks in the same niche; then I believe I can outrank the negative one. I have started a UGC system for my brand domains and hope it can outrank the article before long. Any ideas? My brand domains sit nine places further down the SERP, while the negative result is at position 5 on the first page. Thanks

    | leadsprofi
    0

  • EGOL, I would have responded directly to your response, but found yet another bug between this site and the iPad. I wrote to tech support, but the response was, well... not so favorable. To answer your question, yes, I tried it. The short story is it takes 15 or 16 keystrokes to accomplish what should be one keystroke. I don't want to rock the boat too hard about the iPad issues because I'm a newb. I think it's an easy fix. If any administrators monitor this, feel free to tap my shoulder at MozCon. I would be happy to help get it fixed. This is a great forum and I want to thank everyone for sharing their wealth of knowledge and experience so freely. I will try to reciprocate whenever I feel my comments could be helpful. Thanks again!

    | dmac
    0

  • I can't imagine myself running that many sites with the same content. My approach would be to make one big asskicking site that would beat all of the competitors in the various markets. But if you are going to have multiple copies out there... I would write the Bible about a topic and place it on my main site... and the other sites would have summary articles that promote the Bible that can be found on the main site. I would use rel="canonical" on the small sites as described by Ryan (a quick sketch of that tag is below).

    | EGOL
    0
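
    A minimal sketch of the tag EGOL and Ryan are describing, assuming the "Bible" article lives on the main site and the summary article sits on one of the smaller sites (both domains below are placeholders):

      <!-- Placed in the <head> of the summary article on the small site -->
      <link rel="canonical" href="http://www.main-site-example.com/bible-article/" />

    This tells the engines to consolidate ranking signals from the summary page onto the main-site article rather than treating the two as competing duplicates.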

  • Thank you Ryan and Ogy. Those answers are much appreciated.

    | jfeld222
    0

  • Artience Girl, the information shared by Shane, Aaron and Lewis is correct. Google wants to see the same page as it would be shown to a user under the same circumstances. If Google is crawling your page from San Jose, California, then they want to see what a user from San Jose would see. If they later decide to crawl your site from their data center in London, they want to see your site as it would be seen by a London user. The geo-targeting redirects you are presently doing are fine. If you were to write any code which says to always show the Google crawler the US version of your site, that tactic would be defined as cloaking. Any time you write code to specifically identify a crawler and show it different content, you are cloaking (a sketch of that anti-pattern is below). It seems you are a bit uncomfortable with the answers, so let me set you at ease by sharing a Matt Cutts response to your question: http://www.youtube.com/watch?v=GFf1gwr6HJw

    | RyanKent
    0
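
    To make the distinction concrete, the user-agent-based rule Ryan describes as cloaking would look something like the hypothetical .htaccess sketch below; it is shown only as what NOT to do. Geo-targeting redirects key off the visitor's location, not the crawler's identity, which is why they are fine:

      # Anti-pattern: singling out Googlebot and serving it fixed content.
      # Conditioning on the crawler's user agent like this is cloaking.
      RewriteEngine On
      RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
      RewriteRule ^(.*)$ /us/$1 [L]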

  • Unfortunately you can't change crawl settings for Google in a robots.txt file; Google simply ignores them (an example of the directive other engines honour is below). The best way to rate-limit Googlebot is the custom crawl settings in Google Webmaster Tools (look under Site configuration > Settings). You also might want to consider using your load balancer to direct Google (and other search engines) to a segregated group of servers (app, db, cache, search), thereby ensuring your users aren't inadvertently hit by performance issues caused by overzealous bot crawling.

    | RichardVaughan
    0
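
    For reference, the directive people usually try in robots.txt is sketched below with a hypothetical value. As Richard says, Googlebot ignores it (Bing and Yandex do honour it), so Google's rate has to be managed from Webmaster Tools instead:

      User-agent: *
      # Ask compliant crawlers to wait 10 seconds between requests.
      # Googlebot ignores Crawl-delay; set Google's rate under
      # Site configuration > Settings in Google Webmaster Tools.
      Crawl-delay: 10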

  • If all 301s have been removed then this is how I would see the end result: examplesiteA.com is a brand new site sitting on an old domain. examplesiteA has many links to it whereby users expect to see the pages which were there but have now moved to another site. The biggest issue is that a percentage of the linking sites may notice the change and break their links. If the links are maintained, then I don't see any reason for dilution or devaluation. examplesiteB.com is the old site from the "A" location. When the site moved, the 301s were in place. Assuming Google crawled every page of the old "A" site and found all the 301'd pages, then they are aware of the moved pages and have updated their index. The challenge is, this move clearly kicked up a lot of dust. How long will it take Google to fully index both sites? Until that happens the rankings may bounce. There is an SEO theory that links increase in value with age. I disbelieve that theory; therefore, I don't believe you would experience any link dilution. The only issue would occur if a user expecting to find SiteB at the end of a link now finds SiteA, and therefore removes the link due to not being satisfied with the changed page.

    | RyanKent
    0

  • This is actually a best practice IMO (a quick sketch is below). If that page for whatever reason is duplicated somewhere else (it could happen a year down the line), then you automatically eliminate the issue of dupe content and are already providing a directive to Google on which page is the original and which page to count.

    | MarcLevy
    0
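
    In practice this just means every page carries a canonical tag pointing at itself; a minimal sketch with a placeholder URL:

      <!-- In the <head> of the original page itself -->
      <link rel="canonical" href="http://www.example.com/original-article/" />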

  • Hi Joe, thanks for your advice. However, if you look at the websites, gansbaai.com is really bad, it only has 1 page. I really only bought it for the domain age and the couple of links it has (and because I want to launch my new site under this domain)... so there are not enough pages to 301 all of danger-point's pages to, and I want to lose as little link juice as possible... Any feedback would be greatly appreciated. Thanks

    | DROIDSTERS
    0

  • I'm sure there are some people that do prioritize their sitemap manually, but not me. I don't think the priority setting is THAT important!

    | AdamThompson
    0

  • Bots can burn through script... I don't think an extra line is really a speed bump... but yes, always fun to test things.

    | Thos003
    0

  • Lots of great feedback has been offered. In short, it's up to your personal preference. I can't help but add a link because I have watched too many Matt Cutts videos (they are starting to auto-play in my head) and he answered your exact question. http://www.youtube.com/watch?v=971qGsTPs8M

    | RyanKent
    0

  • Make sure you redirect the existing pages to the new URLs using 301 permanent redirects. This will help retain the value of the pages to some extent, and the existing visitors as well. Defining everything in a .htaccess file would be the best option, and you can also use code in a PHP file to further manipulate the redirects (a sketch of both is below).

    | Develop41
    0
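
    A minimal sketch of what Develop41 describes, with placeholder paths; each moved page gets its own rule (or a pattern-based RewriteRule if the old and new URL structures map cleanly):

      # .htaccess: one-to-one 301 redirects for moved pages (Apache, mod_alias)
      Redirect 301 /old-page/ http://www.example.com/new-page/
      Redirect 301 /old-category/old-post/ http://www.example.com/new-category/new-post/

    The PHP equivalent, for redirects that are easier to handle in application code, is to send the status and Location header before any output, e.g. header("Location: http://www.example.com/new-page/", true, 301); exit;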