Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • ahhh, duh! Dr. Pete shed light on what we should be thinking about here. You're not getting messages for sending out too much PR but for too many links. He's right; nofollow will not stop them from being counted. Nofollow stops PR from being passed. Link equity is a broader concept than PageRank: it considers relevance, authority and trust, link placement, accessibility, any value of relevant outbound links, etc. It sounds as if you need to focus more on how you implement the links on your site. If you need to reduce links, as mentioned earlier, load them via AJAX from an external file if they are needed on the page. If they don't offer any value, then remove them. I viewed your page earlier but cannot access it now; the links didn't appear to help the user experience anyway. Often what's good for the user is good for Google.

    | DanaLookadoo
    0

  • Before you delete anything, submit a ticket to the help desk. They may be able to fix it or point out why it's happening. http://www.seomoz.org/help

    | sprynewmedia
    0

  • Yes, the owner can delete messages, but perhaps you can still find something in the indexation history graph or elsewhere.

    | OrionGroup
    0

  • Ok, got it. Thanks a lot for the explanation. S.H

    | sherohass
    0

  • I agree with Andy; it totally depends on the type of data you are marking up structurally. For example, product pages need different markup (quantity, price, reviews, etc.) than your main company data (address, phone, etc.). If you share what data you are marking up, we can better advise you on which pages should get code added.

    | jws8118
    0
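To make the product-page case concrete, here's a minimal sketch of building schema.org Product markup as JSON-LD with Python. All of the product values (name, price, rating) are made up for illustration; swap in your own data:

```python
import json

# Hypothetical product values -- substitute your own data.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "27",
    },
}

# Emit the script tag you would embed in the product page's HTML.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(product, indent=2)
print(snippet)
```

Company-level data (address, phone) would use a different type such as Organization or LocalBusiness instead, which is why it usually belongs on different pages.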

  • Hi Ben, if it works in GWT but isn't showing in Google results, it's probably because Google doesn't think the page is important enough for rich snippets. The page is probably eligible, but Google isn't obligated to show them.

    | OrionGroup
    1

  • I'll PM you my recommendations - can you mark my answer as helpful above too?

    | Ubique
    0

  • Linking from a client site back to your site isn't a great way to go; even though the pages are related, the topical relation really runs from your project page to their site. I would have the usual clients/projects section on your site with links to your projects, but any link back from them to you should be nofollow. A site-wide footer link is not going to do you many favours anyway; if it is just in their homepage footer it has less potential negative impact, but it is still not a real endorsement.

    | karimg
    0
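If you want to check whether a client page's link back to you is actually nofollowed, a small scan with Python's standard-library HTML parser could look like this. The domain and footer markup here are invented placeholders:

```python
from html.parser import HTMLParser

class BacklinkAudit(HTMLParser):
    """Collect <a> links to a target domain that lack rel="nofollow"."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.followed = []  # links that would still pass equity

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").lower().split()
        if self.target_domain in href and "nofollow" not in rel:
            self.followed.append(href)

# Hypothetical footer markup from a client site.
html = '<footer><a href="https://your-agency.example/">Built by Us</a></footer>'
audit = BacklinkAudit("your-agency.example")
audit.feed(html)
print(audit.followed)  # any hit here is a followed backlink you may want nofollowed
```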

  • Hi, I edited the robots.txt file for my website http://debtfreefrombankruptcy.com yesterday to allow search engines to crawl my site. However, Google isn't recognizing the new file and is still saying that my sitemap is blocked from search. Here is a link to the file itself: http://www.debtfreefrombankruptcy.com/robots.txt The Blocked URLs tester said that the file allows Google to crawl the site, but in actuality it still isn't recognizing the new file. Any advice would be appreciated. Thanks!

    | nextlevelweb
    0
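While waiting for Google to re-fetch the file, you can sanity-check the rules yourself with Python's built-in robots.txt parser. Paste the live file's contents in rather than fetching it, so you test exactly the text Google would see. The file body below is an assumed allow-everything example, not the actual contents of the site's robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Assumed allow-everything robots.txt -- paste your real file contents here.
robots_txt = """\
User-agent: *
Disallow:

Sitemap: http://www.debtfreefrombankruptcy.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# True means the rules permit Googlebot to crawl the URL.
print(rp.can_fetch("Googlebot", "http://www.debtfreefrombankruptcy.com/"))
```

If this prints True but GWT still reports the sitemap as blocked, the issue is usually that Google is working from a cached copy of the old file; resubmitting the sitemap or using the Blocked URLs tool's fetch can speed up the refresh.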

  • Thanks Tom

    | Johnny4B
    0

  • How long ago did you switch platforms? It can take months for Google to come back around to a page that linked to your site, and pages on your site will stay in the cache for a few crawl passes. When you switched, did you do any 301 redirects? Examine the backlinks to your domain: any that come from good pages should be redirected to the new URL. If not, they will be scooped up by active SEOs (finding 404 links is a popular link building technique). http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633 If you know the links will be dead forever, try a 410 response, as it is supposed to make search engines drop the page faster. http://www.seroundtable.com/404-410-google-15225.html (bottom) Have you requested that Google remove old directories/pages? If the content is gone and has no backlinks, try a removal request. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427

    | sprynewmedia
    0
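The 301-vs-410 decision above can be sketched as a simple lookup: old URLs with a new home get a 301 to that URL, URLs that are gone for good get a 410, and everything else falls through to 404. All of the paths here are hypothetical:

```python
# Hypothetical mapping of old-platform URLs to their new homes.
REDIRECT_MAP = {
    "/old-platform/about.php": "/about/",
    "/old-platform/services.php": "/services/",
}
# URLs that are gone for good and have no replacement.
GONE_FOREVER = {"/old-platform/promo-2011.php"}

def respond(path):
    """Return the (status, location) a server should send for an old path."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]   # permanent redirect to the new URL
    if path in GONE_FOREVER:
        return 410, None                 # tell crawlers to drop the page quickly
    return 404, None                     # default: not found

print(respond("/old-platform/about.php"))  # (301, '/about/')
```

In practice this logic would live in your server config (e.g. rewrite rules) rather than application code, but the decision table is the same.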

  • We don't have a view-all page (we found them so slow, so long, and with so many links that we saw a notable improvement in rankings in general when switching to the quicker paginated versions). And other than the first page, none of the other pages are currently in our sitemap. I'm not entirely sure how that would stop GWT flagging it as a duplicate meta, though, unless you mean to also noindex them.

    | My-Favourite-Holiday-Cottages
    0
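One common way to stop GWT flagging paginated series as duplicate metas, without a view-all page, is to append the page number to the title and description from page 2 onwards. A minimal sketch (the base title is made up):

```python
def page_title(base, page):
    """De-duplicate paginated meta titles by appending the page number."""
    if page == 1:
        return base  # page 1 keeps the clean, canonical title
    return "%s - Page %d" % (base, page)

print(page_title("Holiday Cottages in Cornwall", 1))
print(page_title("Holiday Cottages in Cornwall", 3))
```

Combined with rel="prev"/rel="next" links between the pages, this tells Google the pages are a series rather than duplicates of each other.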

  • Perhaps laziness. Perhaps it's just someone who came up with a good solution that worked for them and then wanted to make money selling it to others.

    | OrionGroup
    0

  • That's quite hard to tell without knowing more about the website, keywords etc. Will you target the US with the same website or will you launch a new one? After you've decided on which keywords you'd like to rank for, start with making an analysis of the keyword difficulty and the competition in the US. Take a look at the metrics of sites that rank well for the keyword you're targeting and have a look at their link profile. Also don't forget to do a little keyword research so you get an idea of the search volumes and can check if users in the US use the same keyword as they do in the UK. This information will give you a better understanding of the market and will allow you to plan and prioritise your SEO efforts better.

    | Solvari
    0

  • This applies to the Google Mini or Search Appliance, which are custom search tools for an individual website. They allow site owners to sculpt the indexing of their private setups. AdWords also has something to help indicate the important content for determining the page topic for relating ads. However, they don't apply to Googlebot spidering, as mentioned above.

    | sprynewmedia
    0

  • I infer from what you're saying that the number of people with JavaScript disabled in 2013 is probably less than negligible. This is a very good point. Looks like we're perhaps trying to fix the engine of a car we should have got rid of a while ago...!

    | LawrenceNeal
    0