Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thanks SEOStallion. Sean

    | Yozzer
    0

  • The best course of action is to 301 all of the pages, because: there may be some links to these pages, and these pages may presently be carrying some authority, which would be transferred to the new pages (i.e. where the 301 resolves).

    | SEM-Freak
    0
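For anyone unfamiliar with setting these up, a 301 can be declared per page, for example in an Apache .htaccess file (paths and domain here are hypothetical, and this assumes mod_alias is enabled; other servers have equivalent directives):

```
# Permanently redirect a removed page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html
```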

  • Thanks both for your responses. It's a strange one, and I can only assume that these pages remain in Google's index - I have checked many link sources and found that the links do not exist, and haven't since the pages were deleted. It seems ridiculous that you should have to 301 every page you delete; there are literally 500+ of these phantom links to non-existent URLs, and the site is changing all the time. I have opted to add a 'noindex' meta to the 404s and also to encourage them to drop out of the index by adding the pages to the robots.txt file. Let's see if it works - I'll post on here when I know for sure, so other people with the same question can see the outcome. Thanks again, Damien and Steven.

    | RiceMedia
    0
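The 'noindex' meta mentioned above is a single tag placed in the head of each page; a minimal example:

```
<head>
  <!-- asks compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

One caveat: if the same URLs are also blocked in robots.txt, a crawler may never fetch the pages at all and so may never see the tag.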

  • 301 redirects still pass link juice, just not quite all of it. If the content of the pages remains the same, you should still be fine; you might see a slight slip, but not much.

    | DanDeceuster
    0

  • The site you've pointed to uses AJAX to load its content. When the page loads, a JavaScript snippet takes over and adds the # to the page (hence why you're not seeing it as an HTTP header). If you click on any other link you'll see that the base URL stays the same, with some extra parameters on the end. There are potential crawling issues with this, and a number of fixes (some Google documentation here, although this isn't the fix that the site in question is using: http://code.google.com/intl/en-US/web/ajaxcrawling/). So, in short, there's nothing fishy going on - it's just good old AJAX content loading. Matt

    | mattbeswick
    0
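For context, the Google AJAX-crawling scheme linked above worked roughly as follows (it was later deprecated; the URL below is purely hypothetical for illustration):

```
User-visible URL:   http://example.com/products#!category=shoes
Crawler requests:   http://example.com/products?_escaped_fragment_=category=shoes
```

The server was expected to return a static HTML snapshot of the AJAX state when it received the _escaped_fragment_ form of the URL.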

  • Hi, Using: User-agent: * Disallow: /folder/subfolder is fine. However, if you have information stored on your website that you certainly want crawled, make sure it is in your sitemap and use: User-agent: * Allow: /folder/subfolder Adding a nofollow attribute to all of your pages won't be practical; if a spam crawler ignores the robots.txt, it will also ignore your nofollow attribute. If anything new occurs with robots.txt, check large websites' robots.txt files, as they always update to new trends, e.g. www.google.com/robots.txt Hope this helps :)

    | portalseo
    0
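Put together, a minimal robots.txt combining the directives discussed above might look like this (folder names and the sitemap URL are hypothetical; note that support for Allow varies by crawler):

```
User-agent: *
Disallow: /folder/
Allow: /folder/subfolder/

Sitemap: http://www.example.com/sitemap.xml
```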

  • Got it. Thanks for both the good responses.

    | mosaicpro
    0

  • Looking forward, if you plan on keeping your old domain well into the future, search engines do take into account the domain expiration date. This indicates to search engines how permanent the site is, but I can't say how much this will affect your ranking.

    | kwoolf
    0

  • There are several ways to measure the success of the content, and these can be divided into two categories - quantity and quality metrics.

    Quantity metrics: visitor engagement metrics (time spent on page, bounce rate, page views per visit); conversion metrics (orders/conversions driven by the content); social metrics (Facebook likes/shares, tweets and other share metrics); number of subscribers to the RSS feed (if the content is part of a blog); number of comments; rank of keywords driving organic traffic to the page.

    Quality metrics: quality of comments; quality of social shares; feedback received from users; LDA score (although LDA is a quantity metric, it fits the quality criteria because the LDA algorithm is essentially a mathematical indication of how relevant the content is to a keyword).

    Best, Sameer

    | ninjamarketer
    0

  • Thank you for your update! I agree with your tests, and your solution is probably the best from a user experience point of view as well: giving a nice description and useful links on category pages should improve the number of pageviews and time spent on site.

    | MauroMazzocchini
    0

Check this -> http://www.google.com/support/webmasters/bin/answer.py?answer=83105 Use the Change of Address tool in Webmaster Tools to notify Google of your site's move. (Note: To use the Change of Address tool, you must be a verified owner of both the new and the old sites.)

    | mosaicpro
    0

  • I went through both sites and noticed that you don't have canonicals on any of your pages. I also don't see any XML sitemaps. If you can only show the businesses that are in that particular area, and remove the ones that are not, I would probably go for that, unless you are doing Local Search as well. I would be curious to see the example of the canonical issue you mentioned. Are these Joomla sites? I'm just asking because there are no description tags on anything except the home page.

    | sferrino
    0
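For reference, a canonical is one line in each page's head pointing at the preferred version of the URL (the address below is hypothetical):

```
<head>
  <!-- tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="http://www.example.com/preferred-page/">
</head>
```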

  • lol...  Alan, you really surprised me with this answer! Maybe CRS should be added to all of the SEO glossaries?

    | EGOL
    0