Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • The first thing I notice is that you have two different pages: http://www.britishhardwoods.co.uk/oak-skirtings-architraves.php and http://www.britishhardwoods.co.uk/oak-skirtings-architraves.php/ The page with the "/" on the end is actually a different physical page. Both pages have identical content presented differently; at a glance, I would guess the trailing-slash version is intended for users viewing your site from a mobile phone or an older browser. Either way, you need to canonicalize these pages; right now they will show up as duplicate content to Google. As for your initial post, both versions of the page do contain a proper meta description, and the length is good at around 110 characters. I would also check these pages in the crawl report for other issues; perhaps the crawler is getting confused. The canonical issue is the big concern and should be addressed immediately. Good luck.

    | RyanKent
    0
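
    A canonical tag like the sketch below would signal the preferred version to Google. (Hedged example: the assumption that the non-trailing-slash URL is the preferred one is mine, not the poster's.)

    ```html
    <!-- Placed in the <head> of BOTH versions of the page.
         Assumption: the non-trailing-slash URL is the preferred version. -->
    <link rel="canonical" href="http://www.britishhardwoods.co.uk/oak-skirtings-architraves.php" />
    ```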

  • Hi Wildner, thanks for the ideas; I was thinking along these lines. Thanks for your input!

    | Turkey
    1

  • I've actually found site-wide links to have detrimental effects when tested. I usually go for in-content links.

    | David_ODonnell
    0

  • Meta robots refers to the <meta name="robots"> tag at the page header level. This is usually the case when a blog is set up with an SEO plugin such as All In One SEO, where you can manually set which content is blocked. It's common to block archives, tags, and other sections, on the theory that allowing these to be crawled could either cause duplicate content issues or drain link value from the primary category navigation.

    | AlanBleiweiss
    0
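
    For reference, a page-level block like the one described above might look like this (a generic sketch, not taken from any particular site or plugin's output):

    ```html
    <!-- In the <head> of an archive or tag page you want kept out of the index.
         "noindex" blocks indexing; "follow" still lets crawlers follow the links. -->
    <meta name="robots" content="noindex, follow" />
    ```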

  • Well, I'd say that the noindex issue Gareth pointed out and the major crawlability error I discovered are your two biggest initial concerns. But yeah, once the dust settles, the other issues I pointed out are all a concern. Unique content that's specific to each page is quite important to long-term SEO success.

    | AlanBleiweiss
    0

  • Thanks for the suggestions, Darren - we did discuss this as a possible reason and we plan on testing this theory out.

    | HarborOneBank
    0

  • Good video! Definitely going with regular links for this secondary menu.

    | Motava
    0

  • So my best bet is to make them nofollow?

    | DavidKonigsberg
    0
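
    For what it's worth, nofollow is set per link with the rel attribute; a minimal sketch (the URL and link text are hypothetical):

    ```html
    <!-- Hypothetical example: a link marked so it passes no endorsement. -->
    <a href="http://example.com/some-page" rel="nofollow">Some page</a>
    ```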

  • We ran a test linking to a new domain (let's say it was asdfdfhfgj.com) and found that the second anchor text pointing to the same URL carries a lot less weight than the first one. We linked to asdfdfhfgj.com from a respected domain/page: two links, different anchor texts. Both anchor texts were random strings that previously returned no results on Google. For the first few months the page only ranked for the first random string, not the second. This is obviously just one data point, but it does suggest something about the way Google treats multiple links to the same URL from a single page. However, this effect is far from a penalty, and I doubt Google would ever penalize a page for having two links to the same URL.

    | PanuKuuluvainen
    0

  • The suckerfish menus at htmldog work well.... http://www.htmldog.com/articles/suckerfish/example/ http://www.htmldog.com/articles/suckerfish/dropdowns/example/ http://www.htmldog.com/articles/suckerfish/dropdowns/

    | EGOL
    0

  • Hi Steven, thanks for your fast answer!

    | GregorHendrych
    0

  • Hi again, Keri, and thanks for all your analysis. I think the pages you found on archive.org are the root of the problem - this was when the domain name was 'parked' at WhyPark. I think that's where Google gave it the original penalty - not fair, IMO, but at any rate I confessed my sins in this regard ages ago, to no avail. Thanks for the content tips too - I'll fix them up now. (There are 5 games currently on the site, btw - not sure why you couldn't see them.) Thanks again, David

    | OzDave
    0

  • Thanks Ryan. Does that mean registering hundreds of expired domains with backlinks and 301-ing them all to the main site would be a viable link building strategy? It shouldn't be, in theory, because the original 'votes' were for the expired site, not the new site. But how would Google work this out algorithmically, other than by heavily devaluing a 301 redirect from one domain to another? And if they did that, surely the minisite would be the best option? Thanks again, David

    | OzDave
    0

  • Although web hosting location is one factor that the search engines will look at for localisation, they will also look at domain registration contact info, the published contact address on your website, and other signals that may provide a clue about 'where' you do business. I had a Canadian client a while ago who intended to do business primarily in the US. Despite the fact that their site was hosted on a US machine, it was ranking suspiciously low for some key phrases despite their good optimization efforts. After further review, everything about their business (domain contact info, business address) would lead you to think that they do business in Canada, not the US (they showed up on page 1 for Google.ca searches). It wasn't until we went into Google Webmaster Tools and set the geotargeting to the United States that we saw their key search phrases jump up onto page 1 for Google.com searches. My opinion: although server location is a factor, I've seen other factors carry more weight.

    | adrianvender1
    0

  • How did the splash page work for you? Any lessons learned that will be helpful for others in the SEOmoz community? Would be great to see a comment about how things went.

    | KeriMorgret
    0

  • This is correct; Rand Fishkin actually did a blog post a few years back stating that it is more valuable to have a blog located in the site's subdirectory as opposed to on a subdomain. So I would assume that if you're creating a site that will encourage links to it, it would be more beneficial to create it in a subdirectory so that the popularity is transferred to the main domain.

    | SEMCLIX
    0

  • I'd just add that if the solution chosen is noindex, use the noindex, follow method, just to give the extra cue in case there are links on those pages.

    | AlanBleiweiss
    0