Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • You're on the right track as far as showing users the desktop versions of pages that have no mobile equivalent instead of 404s: "If your content is not available in a smartphone-friendly format, serve the desktop page instead." -http://googlewebmastercentral.blogspot.ca/2013/06/changes-in-rankings-of-smartphone_11.html

    However, you should let Googlebot (mobile or not) crawl your site the same way your users would, which means you shouldn't block googlebot-mobile. It is totally fine for a desktop page to be ranked in the mobile index. It's not ideal (see the article above), but it's better than not being there at all. Think about it this way: if you're OK with your mobile users seeing non-mobile-optimised pages, then it doesn't matter whether they come from Google or not. The page is still relevant to them.

    As an extra, you could consider making some of the non-optimised pages responsive to improve the user experience a bit. Sometimes a few tweaks here and there can make a difference.
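    As a quick sanity check, a minimal robots.txt sketch (hypothetical paths) that keeps Googlebot-Mobile crawling the same as everyone else:

    ```text
    # robots.txt - let all Googlebot variants crawl the site the same way users do
    User-agent: *
    Disallow: /admin/        # example private area (hypothetical path)

    # Avoid rules like the following, which hide pages from the mobile index:
    # User-agent: Googlebot-Mobile
    # Disallow: /
    ```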

    | AxialDev
    0

  • Honestly...  I can't say that I would tell a friend that he should hut toads.

    | EGOL
    0

  • [Quote]Ahh, I read it as ..."then 301 redirect to the old URL."[/Quote] Hah!  I did the same thing first.  Then I had to rewrite my response.

    | Kurt_Steinbrueck
    0

  • thank you for this, will do this now, many thanks for the great advice

    | ClaireH-184886
    0

  • Lisa, use the plugin @Saijo George recommended: http://wordpress.org/plugins/limit-a-post-title-to-x-characters/ It will do the trick automatically for all pages and it doesn't require any programming (PHP or MySQL) skills.

    | FedeEinhorn
    0

  • It's a bit tricky - we actually count something like 90%+ duplicated content as "duplicate", so we may be giving you false alarms on this particular example. My gut reaction is that they're thin pages - they look fine on the surface, but there's very little text content for Google to parse, and I think the overall content is very light even from a usability perspective. From a pure SEO perspective, if you had a lot of these very similar pages, you could run into some trouble.

    Honestly, at this point, you just don't have the authority (link profile, etc.) to support 10,000 products and the roughly 18,000 pages Google has indexed on your site. In the extreme case, you could run into a Panda penalty, but overall it's just an issue of dilution. Basically, you don't have the ranking power to support that many products, especially if Google perceives the content as thin.

    It's a balancing act, but I'd consider potentially NOINDEX'ing some of your thinner product offerings while you build up unique content for at least your top sellers. This doesn't have to be all-or-none. It may be that a couple hundred or even a few dozen products account for 90% of your sales, so start with those. Meanwhile, de-index some of your weakest content, and let the rest build up over time.

    Of course, there may be other issues at play, like actual URL-based duplicates, that could be tackled before you start removing products from the Google index. Again, it's a balancing act. You could also use rel=canonical, but my gut reaction is that it's borderline for the cases you're showing here. These aren't true duplicates - they're separate products. It's just a matter of whether you want Google to see them yet or not.
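    For reference, de-indexing a thin product page is typically done with a robots meta tag in that page's head - a minimal sketch:

    ```html
    <!-- In the <head> of a thin product page you want out of the index -->
    <!-- noindex removes it from results; follow still lets its links be crawled -->
    <meta name="robots" content="noindex, follow">
    ```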

    | Dr-Pete
    0

  • Hi Becky, has your issue been resolved?

    | Christy-Correll
    0

  • Hi PureMobile, you've received some great responses. Has your question been answered?

    | Christy-Correll
    0

  • Hi all, this might be a stupid question, but what is the "conventional way" of building links? Thanks

    | cwsinc
    0

  • Hi Social Engaged, I believe this has been covered a while back, here is the link! I also believe it shouldn't be much of an issue as Google will be able to distinguish that they are social profiles and are supposed to be kept consistent.

    | Jonathan_Hatton
    0

  • Gotta agree with you on this. Having a "blanket" rule to render all links nofollow will certainly help to discourage spammers, but may not help in encouraging contributors. Everybody loves a little incentive here and there. Many popular forums, including Moz Q&A, have adopted structures that remove the "nofollow" tag from links or allow more links for top-quality contributors. You might want to consider adopting such a structure for your forum.

    | ReferralCandy
    0

  • Legitimate comparative use should be fine. So if you wrote a blog post about how your service compares to two or three others in your industry, or had a page on your site about the common services offered by your competitors and how you stacked up in comparison to show you have more or better-priced services, those would likely be considered fair use. Just make sure you're not saying anything blatantly deceptive or slanderous.

    | MikeRoberts
    0

  • Do you work to improve website page speed performance? If not, could you recommend any person or company that can get the job done? Thanks, -Mike

    | naturalsociety
    0

  • You should look into the Duplicate URL Redirect module by Presto-Changeo. It fixes this issue.

    | LesleyPaone
    0

  • Hi Laurence - Google tries to take the iFrame and associate it with the page it's on (in effect trying to view it as a single page), so in their ideal world, you should look at the page and the iFrame as one page and treat it accordingly. In reality, though, they don't always accomplish what they want, so you might be OK. What I would do is check your cache and text-only cache and see if they're caching everything as one page. Here's their documentation on iFrames  -Dan

    | evolvingSEO
    0

  • As Chris says, this shouldn't be a problem at all. We've done this on a few stores and not seen any measurable impact in either direction. The only thing to be aware of is the possibility of introducing duplicate URLs: http://example.com and https://example.com are different URLs, and if both are accessible and return a valid header, then both can be indexed. It's always worth ensuring that you have a redirect, a rel=canonical, or a robots.txt rule addressing that issue if you have https in place.
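    A minimal sketch of the redirect option, assuming Apache with mod_rewrite enabled (the rel=canonical alternative is shown as a comment):

    ```apache
    # Force a single protocol: 301-redirect http:// requests to https://
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

    # Alternative: leave both live and declare one canonical in each page's <head>:
    # <link rel="canonical" href="https://example.com/page">
    ```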

    | matbennett
    1

  • Correct, it is a pain to do, but if you do things correctly and have an array that you can draw meta information from, it will get indexed. That is what we all want: instead of making a category of "red ipods", you can just have the search for "red ipods" indexed while disallowing anything from ?search=other term
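    A minimal robots.txt sketch of that idea (hypothetical parameter and value): block arbitrary internal search URLs while allowing the one query you want indexed - Googlebot honors the longer, more specific rule:

    ```text
    User-agent: *
    # Block crawl of arbitrary internal search result URLs...
    Disallow: /*?search=
    # ...but allow the specific query you want indexed (the longer Allow rule wins)
    Allow: /*?search=red+ipods$
    ```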

    | LesleyPaone
    0

  • If you have a 301, use the 301 - done, mission accomplished. Google should drop the original page and start to use the new page in its place in the SERPs. This is also automatic for the user, as they are moved from one page to the other. One thing: you want to make sure that the page you are sending people to is semantically related to the page they were sent from; otherwise you risk losing rank in the SERPs.

    If you use the 301, there is no original "page" on which you can put the canonical or a noindex. If you could not 301, you would want to use only the canonical - Google usually will treat a canonical like a 301. If you use a canonical, you should not have to use the noindex.
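    A minimal sketch of each option, assuming Apache and hypothetical URLs:

    ```apache
    # Option 1: 301 redirect - the old page drops out and the new page takes its place
    Redirect 301 /old-page.html /new-related-page.html

    # Option 2 (when a 301 isn't possible): rel=canonical in the old page's <head>,
    # which Google usually treats like a 301:
    # <link rel="canonical" href="https://example.com/new-related-page.html">
    ```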

    | CleverPhD
    0