Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • They should be indexable on the same page. Reviews are very important to improve your rankings, and by creating two pages, you're essentially splitting your link juice. Plus, if your product information goes on those review pages, Google might look at it like duplicate content and consider both pages the same thing. I think it's always a bad idea to create separate pages for bots and people. If you must create two pages, you can use the canonical tag (if both pages populate the same info, minus reviews) to help clarify things. But your best choice for SEO is to have one page.
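    If you do end up with the two-page setup, a canonical tag in the <head> of the reviews page, pointing back at the product page, would look something like this (the URL is made up for illustration):

```html
<!-- On the reviews page, e.g. /widget/reviews -->
<!-- Tells Google the product page is the preferred (canonical) version -->
<link rel="canonical" href="http://www.example.com/widget" />
```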

    | kennyrowe

  • Hi, as the others have said, your backlink profile looks pretty nasty. Another thing to watch is your anchor text: a brief look through OSE shows you have 100% keyword-rich anchor text. Natural links often use the domain as the anchor text, so try to get some good-quality, relevant links using www.playhouses.co.uk or http://www.playhouses.co.uk.

    | Confetti_Wedding

  • Socialdude, that may well be the case. It also looks more natural if you have keyword links going to internal pages. Typically you'd expect the keywords for a home page to be the company's name or its main niche, with more specific keyword links targeting internal pages. It sounds like your keywords are working for your homepage at the moment, so you may want to continue with that, but I would add more diversity with internal pages at the same time.

    | PeterAlexLeigh

  • Ah, OK, sorry, I got the wrong end of that. They will be using rich snippets: markup such as schema.org - http://schema.org/Review or http://schema.org/AggregateRating should achieve the result.
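    For example, a microdata snippet using http://schema.org/AggregateRating might look roughly like this (the product name and figures are invented):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <!-- Aggregate rating pulled from the review data -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    based on <span itemprop="reviewCount">37</span> reviews
  </div>
</div>
```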

    | RikkiD22

  • I have these recommendations:

    1. Specify your doctype to help Google see what your content type is. For mobile, Google supports XHTML Basic, XHTML Mobile Profile, XHTML, WML, cHTML (iMode), and EZweb (KDDI). We run HTML5, CSS3, and jQuery successfully, but these are not specified by Google, so I would recommend playing it safe and going with what Google supports.
    2. Add a specialized XML sitemap for mobile. It is basically the same as the usual sitemap apart from two additions: the xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0" namespace declaration and an empty <mobile:mobile/> element inside each <url>. Read more here: http://www.google.com/support/webmasters/bin/answer.py?answer=34648 (P.S. Name it mobilesitemap.xml.)
    3. Use WURFL to redirect PC users who land on mobile content in the www index back to the PC version.

    That should help.
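    A minimal mobile sitemap with those two additions might look like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://mobile.example.com/article.html</loc>
    <!-- Empty element marks this URL as mobile content -->
    <mobile:mobile/>
  </url>
</urlset>
```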

    | ClassifiedsKing

  • Correct. Often a site will refer to numerous CSS files. There are tools which will combine multiple CSS files into a single file and properly compress the files to optimize them for page speed.
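    At its simplest, the combining step is just concatenation before minification; a sketch with hypothetical file names:

```shell
# Sample stylesheets created only so the example is self-contained
printf 'body{margin:0}\n'    > reset.css
printf 'h1{font-size:2em}\n' > layout.css

# Concatenate in the order the page loads them; fewer files = fewer HTTP requests
cat reset.css layout.css > combined.css

# A minifier (e.g. the YUI Compressor) can then strip whitespace and comments:
# java -jar yuicompressor.jar combined.css -o combined.min.css
```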

    | RyanKent

  • Thanks Jeff, I've been resisting the Yoast solution for the same reason you raised, but it does seem to be the best option. I'll download it into a test blog and try it out. Thanks for your help, Catherine

    | catherine-279388

  • We did not get more details on that - I assume it may raise some flags if you suddenly redirect 20 domains. I'm not sure if that's the case, or why, which is why I decided to ask a follow-up question. By the way, thank you both for your replies!

    | propertyshark

  • It can be done.  Make sure your development team reads through http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content and http://code.google.com/web/ajaxcrawling/ before coming to any conclusions.  If you have an AJAX based website, you'll have a little extra work to do depending on the route you take.
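    Roughly, the scheme in those articles works like this: Googlebot rewrites "#!" URLs into an _escaped_fragment_ query parameter, and your server answers that request with a static HTML snapshot of the AJAX state (the example URL is made up):

```html
<!-- Pretty URL users see:   http://www.example.com/page#!state=1 -->
<!-- URL Googlebot fetches:  http://www.example.com/page?_escaped_fragment_=state=1 -->
<!-- Pages without "#!" in the URL can opt in via this tag in the <head>: -->
<meta name="fragment" content="!">
```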

    | john4math

  • Does your blog actually contain duplicate content from your website, or is the Moz crawl identifying some of your blog pages as duplicates of other pages inside the blog? That happens on one of our sites because of how the tagging works, but I've never really thought anything of it. Is this the SEO tool you're talking about? http://wordpress.org/extend/plugins/canonical/ Also, have you read this article: http://www.seoresearcher.com/how-to-make-your-wordpress-blog-duplicate-content-safe.htm

    | PeterAlexLeigh

  • If you want to rank for this content you need to display it statically. AFAIK there is no way to instruct googlebot to try different variables to bring up different dynamic content.

    | AdoptionHelp

  • You should read this: http://www.google.com/support/webmasters/bin/answer.py?answer=139066#301 Google seems to prefer 301 redirects where possible, but in your case it may be much more straightforward to use the canonical tag. It was created pretty much for cases where you have variations on a product (such as different-colour shoes that are otherwise identical, each with its own page).
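    For reference, if you do go the 301 route on Apache, a one-liner in .htaccess would look something like this (the paths are hypothetical):

```apache
# Permanently redirect one colour variation to the preferred product page
Redirect 301 /shoes/red-shoe.html http://www.example.com/shoes/shoe.html
```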

    | PeterAlexLeigh