Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.

  • The best resource for XML sitemap information is http://www.sitemaps.org/protocol.php. The order makes no difference at all. If you have proper site navigation, the sitemap does not really perform any extra function. A search engine will scan the sitemap, determine whether it contains any URLs that have not yet been crawled, then decide what action to take. URLs already in its database are disregarded unless they carry new information (i.e. the last-mod date is newer than when the page was last crawled).

    | RyanKent
    0
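As RyanKent describes, the last-mod date is what tells a search engine whether a listed URL needs recrawling. A minimal sitemap following the sitemaps.org protocol might look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; the order of entries does not matter -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- lastmod tells crawlers whether the page changed since the last crawl -->
    <lastmod>2011-10-01</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints.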

  • To be honest with you Eugene, it was watching Wil's video that spurred this question. Wil does offer a solution of sorts, but I feel his method just wouldn't work for what we do, and we have already written articles for all of our categories, so it's a matter of knowing where to turn next.

    | frank-244375
    0

  • Hi Sonja, It sounds like you are referring to pagination issues within your site. The rel="next" and rel="prev" tags were recently made available to address this situation. The Google Webmaster Central Blog has an explanation of how to implement them. Hope that helps, Sha

    | ShaMenz
    0
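To make Sha's suggestion concrete, here is a sketch of the pagination markup on the middle page of a hypothetical three-page series (the URLs are placeholders):

```html
<!-- In the <head> of http://www.example.com/category?page=2 -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
```

The first page in the series carries only rel="next", and the last page only rel="prev".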

  • Oh, gotcha. Glad I could help.

    | WilliamBay
    0

  • April, my guess is that it's related to oscommerce or another login/ecommerce/trial purchase module. The oscsid parameter is from oscommerce. Someone logs in, then they get variables appended to the URLs so that they're tracked through the system. That's one place to tell your developer to look.

    | KeriMorgret
    0

  • I had this problem, too. Here's what I did. Under Manage Keywords, add the keyword/term to the keywords in your campaign. Then go to the On-Page Report Card, choose that keyword, and input the URL of the page you want to track. The most important step is to click Run This Report Weekly; it is on the right, just above the grade box. You can add or delete keywords and change which URLs you want the weekly report for. Hope this helps. Harriet PS: I don't know how the terms are originally chosen by the app, but the ability to change them is what is important.

    | zharriet
    0

  • I did read it all, and my comments summed up my thoughts. You are being deceptive and trying to justify it by saying you are cleaning up the web. When Ryan rightly criticized presenting one thing to search engines and another to everyone else, you dishonestly put words in his mouth with some mumblings about a third-party website owner; you don't seem able to cope with what he did say and instead pretended he stated something else. How do you reconcile these two statements: "Everything but the tag remains unaltered" and "I provide EXACTLY THE SAME information on a page to both users and SE Robots"? Are you presenting the same, or are you altering the tag? And what is this decoy, "I am neither hiding my affiliate links nor cloaking ads to improve CTR"? That's not what he said. He was talking about the no-follow trick you asked about; again you seem unable to cope with the facts. Putting words into someone's mouth is dishonest and childish. Is this something you have done all your life? Next time you want to make stupid comments, read the guidelines first.

    | AlanMosley
    0

  • Assuming that you fear Google might see this as potential duplicate content: don't worry. I know plenty of sites that have this problem due to their CMS, and Google can almost certainly figure out that these are the same pages: domain.tld/ domain.tld/index.html domain.tld/index.cfm etc. What you might want to do is use rel canonical to point index.cfm to / (or vice versa, though I would use /).

    | Sebes
    0
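Sebes's rel canonical suggestion would look something like this, with domain.tld standing in for the real domain:

```html
<!-- Placed in the <head> of domain.tld/index.cfm (and index.html) -->
<link rel="canonical" href="http://domain.tld/">
```

Search engines then treat the / version as the preferred URL and consolidate the duplicate variants onto it.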

  • From the standpoint of Bing not indexing all pages, my first guess would be poor internal linking; after that, ensure you have a good sitemap. One of the biggest errors I see when we have a problem is that we have not paid close enough attention to our internal linking. Hope it helps. Had to go look again: from what I see, the only link to Conservatories Llanelli is in the footer. This may be where your problem is, as it appears the others all have additional links. Try putting a link from one of the other areas on the page and see what effect it has.

    | RobertFisher
    0

  • Here is what I posted last time this came up. I would get those rewritten to dashes for a few different reasons:
    - Dashes are known as the best way to separate terms in URLs.
    - IMO dashes are easier for the user to read and remember, and thus more user-friendly (just my opinion, no data behind this).
    - Plus signs are often used in URL encoding or in query strings, neither of which is great for users, and both of which have in the past been thought to look somewhat "off-putting" to search engines compared to dashes.
    So while it might be a pain, I would say go to dashes. Also, here is a link someone posted last time to an article Rand wrote in 2006: http://www.seomoz.org/blog/11-bes...

    | SL_SEM
    0

  • You sure did, thanks!

    | svdg
    0

  • Hmmm... that sounds kind of sneaky. Not an answer to your question, but maybe Owner B should just get a new domain and 301 his URLs over?

    | EGOL
    0

  • I found the reference for how to use the url attribute at http://schema.org/docs/gs.html#schemaorg_expected (see "Using the url property").

    | AlanMosley
    0
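The "Using the url property" section Alan links to covers pointing an item at its canonical page. A minimal microdata sketch (the item, names, and URL here are placeholders):

```html
<div itemscope itemtype="http://schema.org/Movie">
  <h1 itemprop="name">Avatar</h1>
  <!-- The url property links this item to its canonical page -->
  <a itemprop="url" href="http://www.example.com/movies/avatar">Movie details</a>
</div>
```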

  • In theory, a change from tables to divs should not affect anything if it's simply a layout/styling change. But if you inadvertently changed something else, that's where you might run into trouble. I would double-check that everything else is as before: indexation, URLs, internal anchor text, titles, descriptions, etc. Three things I would do:
    1. You have it on a test server now? Crawl the live site and the new site with Screaming Frog or Xenu Link Sleuth and see if both crawls return the same thing for titles, descriptions, headers, etc.
    2. View both the new and old site with CSS/JavaScript turned off and the user-agent set to Googlebot (there's a plugin for Firefox to do this) and see if you can see all the content.
    3. Finally, run the broken-link-checker plugin for Chrome on the new site and make sure all your links work.
    Immediately after launching the new site, I would also check Webmaster Tools for any errors. -Dan

    | evolvingSEO
    0
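Dan's first step, comparing crawls of the old and new site, can be sketched in plain Python using only the standard library. This is a hedged sketch, not a substitute for Screaming Frog: it parses two HTML documents (inline strings here, standing in for fetched pages) and reports which of the title, meta description, and h1 differ between versions.

```python
from html.parser import HTMLParser

class SEOTagParser(HTMLParser):
    """Collect <title>, <meta name="description">, and <h1> text from a page."""
    def __init__(self):
        super().__init__()
        self.tags = {"title": "", "description": "", "h1": ""}
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.tags["description"] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.tags[self._current] += data.strip()

def compare_pages(old_html, new_html):
    """Return {tag: (old_value, new_value)} for any tags that changed."""
    old, new = SEOTagParser(), SEOTagParser()
    old.feed(old_html)
    new.feed(new_html)
    return {k: (old.tags[k], new.tags[k])
            for k in old.tags if old.tags[k] != new.tags[k]}

old_page = ('<html><head><title>Widgets</title>'
            '<meta name="description" content="Buy widgets"></head>'
            '<body><h1>Widgets</h1></body></html>')
new_page = ('<html><head><title>Widgets</title>'
            '<meta name="description" content="Widgets for sale"></head>'
            '<body><h1>Widgets</h1></body></html>')

print(compare_pages(old_page, new_page))
# → {'description': ('Buy widgets', 'Widgets for sale')}
```

Running this over every URL pair from the two crawls (e.g. fetched with urllib) would flag exactly the accidental changes Dan warns about.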