Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • Thanks. We have a Webmaster Tools account set up - we could tell even from analytics that search terms we were ranking for before, we are not ranking for now.

    | seoidea
    0

  • Thanks, but as I mentioned above, I am in Germany so cannot use analytics at all - hence my need to find another way to do this.

    | soeren.hofmayer
    0

  • Ok, got it. Thanks for your help!

    | BradBorst
    0

  • John - Thanks, I'll start here. I'm not sure why they are set up like this (facepalm).

    | Hyrule
    0

  • That's why I ask on SEOmoz: a sensible answer with sensible advice that can easily be put into action. This had crossed my mind; however, I've been staring at the woods so long I couldn't see the trees! Thanks, EGOL.

    | Entrusteddev
    0

  • What I recall from Matt Cutts is that Google pretty much ignores all meta tags besides the 'description' one. Given that I've never even heard of this one, I think it is pretty safe to assume it is of no value (at least to search engines).

    | Theo-NL
    0

  • You are most welcome. Please let me know if it works.

    | RyanKent
    0

  • Yes. From the Matt Cutts / Eric Enge interview:

    Eric Enge: Can a NoIndex page accumulate PageRank?

    Matt Cutts: A NoIndex page can accumulate PageRank, because the links are still followed outwards from a NoIndex page.

    Eric Enge: So, it can accumulate and pass PageRank.

    Matt Cutts: Right, and it will still accumulate PageRank, but it won't be showing in our index. So, I wouldn't make a NoIndex page that itself is a dead end. You can make a NoIndex page that has links to lots of other pages. For example, you might want to have a master sitemap page and, for whatever reason, NoIndex that, but then have links to all your sub-sitemaps.

    | RyanKent
    0
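
    For reference, the NoIndex directive discussed above is set with the robots meta tag. A master sitemap page handled the way Matt describes might carry this tag (illustrative markup, not from the interview):

    ```html
    <!-- Page is kept out of the index, but its links are still followed
         outwards and can pass PageRank to the sub-sitemaps it links to -->
    <meta name="robots" content="noindex, follow">
    ```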

  • Even though the recipes are the same, I doubt they'd be penalized for having duplicate content. If the search engine decides that it's duplicate content, it may filter out one or the other, or knock one page down in the search results. If your first page of results were all the same page from different domains, the search engine would have failed you. The duplicate content penalties you're thinking of are aimed more at spammers and spam content, where a lot of content is duplicated intentionally.

    In your example, there is a lot else on the page other than the ingredients and the recipe, which should mitigate some of the duplication, and the two pages are also laid out a bit differently in the HTML: one with divs and one with tables.

    Sources: Google Webmaster Central Blog '08; High Rankings Advisor (Jill Whalen) '10.

    | john4math
    0

  • Will using HTTP ping and lastmod increase our indexation with Google? No. You can submit a perfect sitemap and ping Google with changes every hour, but that will not increase the number of pages which are indexed.

    A few good sources discussing sitemaps and indexing:

    http://followmattcutts.com/2010/03/23/matt-cutts-on-sitemap-indexing/
    http://faq.bloggertipsandtricks.com/2010/08/html-xml-sitemap-what-difference-matt.html

    If you have a site with solid navigation, good architecture and links, then there is no need to use a sitemap. Search engines will determine how often your site should be crawled based on your site's authority. They can also determine which pages have been modified by comparing the header dates with their database. I still use a sitemap, but mostly because the process is fully automated. I know of other sites that are well indexed and do not use sitemaps at all.

    With the above understood, I'll try to offer a bit more information directly related to your questions. When you ask about pinging, I presume you are referring mainly to Google and Bing. For those, the answer to all four of your questions is no. Listing your sitemap location in robots.txt will help other search engines whom you did not ping to locate your sitemap. This can include the SEOmoz crawler, for example.

    | RyanKent
    0
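
    To illustrate the last point above, the sitemap location can be declared in robots.txt with a single directive (example.com and the path are placeholders):

    ```text
    # robots.txt - lets crawlers you never pinged discover the sitemap
    User-agent: *
    Disallow:

    Sitemap: http://www.example.com/sitemap.xml
    ```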

  • Haha brilliant! I'm totally with you on that. And since Matt doesn't tend to divulge much (and half of what he does is cryptic), that would put Rand as source number one, or I should say Rand & co... all the staff and associates, etc... on here are pretty much a fountain of knowledge. I'd be screwed if I didn't have SEOmoz to learn things from.

    | SteveOllington
    1

  • The tag is to prevent Microsoft Smart Tags from rendering. It was only included in beta versions of IE6, and in no browsers since. It's not worth including; you can safely delete these tags. There is an explanation of Smart Tags on Wikipedia, and the same question has been asked on Stack Overflow.

    | john4math
    0
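
    For reference, this is the tag in question; per the answer above, it can be safely removed:

    ```html
    <!-- Only ever honored by IE6 beta builds; a no-op everywhere else -->
    <meta name="MSSmartTagsPreventParsing" content="true">
    ```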

  • I agree with Steven on everything he said. Google has been putting a lot of emphasis on making sure SEO's are utilizing GWT as a resource for usability and correct data.  I'd use that as your primary source.

    | malachiii
    0

  • Ensure you have unique and contextually relevant content (text, images, video, audio) exposed on the dynamic page. As I understand it, the engines are looking for content that provides a valuable user experience. In the case of Google and Bing, they are using their respective toolbars (and analytics services) to measure user engagement signals, so quality and maintaining rank rely on both the content and user acceptance of your content. Ask yourself a simple question: will this page satisfy any user who comes to it? I try to have a question and an answer for each page on my site; if the page content does not answer the question, then I think again about how best to solve that.

    | Damien-Anderson
    0

  • As Ryan said, do 301s. You don't want to leave the old versions up and canonical them if you don't have to. Imagine confusing the surfers who land on an old version because of an old link, then click around and end up on the new version.

    | sferrino
    0
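
    A minimal sketch of the 301 approach, assuming an Apache server using .htaccess (the path and domain are placeholders):

    ```apache
    # Permanently redirect the old URL to its new location
    Redirect 301 /old-page.html http://www.example.com/new-page.html
    ```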