Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Many of the big affiliate and price comparison websites route links through an internal folder first. This allows them to record the conversion, set a cookie, and so on, and it also makes all of those outbound links look like internal links. Usually it looks something like this: the hyperlink for "Visit OnlineStore.com" is actually a link to YourDomain.com/partners?merchant=OnlineStore. From there it redirects (what kind doesn't really matter here) to OnlineStore.com, along with any affiliate code you have added to the URL for their tracking purposes. Typically /partners is blocked in the robots.txt file so the search engine can't even follow it to see that it's an external link. A rough sketch of the pattern is below.
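
    A minimal sketch of that redirect handler, assuming Python/Flask; the merchant table and affiliate URL are placeholders, not from any real site:

        from flask import Flask, abort, redirect, request

        app = Flask(__name__)

        # Assumption: your own mapping of merchant name -> affiliate URL.
        MERCHANTS = {
            "OnlineStore": "https://www.onlinestore.com/?aff=YOUR_AFFILIATE_ID",
        }

        @app.route("/partners")
        def partners():
            merchant = request.args.get("merchant", "")
            target = MERCHANTS.get(merchant)
            if target is None:
                abort(404)
            # Record the conversion / set a cookie here, then send the visitor on.
            return redirect(target, code=302)

    Pair it with a "Disallow: /partners" line in robots.txt so crawlers never follow the hop.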

    | Everett
    0

  • Hi Paul, When looking to see if a redirect is the right thing to do, I would do a little more digging into the old domain first. Check to make sure there was no penalty or bad links. Did you purchase this with a view to trying to use the old URL to boost your French Polishing page, or were you going to create a new site? I wouldn't rush to redirect this without knowing a lot more about the history first, otherwise you might cause yourself problems. -Andy

    | Andy.Drinkwater
    0

  • Hi Marc, Where does Joost say this about comments hurting SEO? I've been trying to find it online.

    | tgerencer
    0

  • We all agree that you should get rid of those 302s. I'm kind of getting the idea... So you're saying the possible solution is to turn the category pages into landing pages. It's logical to me too. Of course, making that move will need good organization and a bit of preparation so that nothing gets lost. A quick sketch of the redirect side is below.
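
    A minimal sketch of swapping those 302s for 301s once the old category URLs point at the new landing pages, assuming Python/Flask and a hypothetical URL mapping:

        from flask import Flask, redirect

        app = Flask(__name__)

        # Assumption: old category path -> new landing page path.
        MOVED = {
            "/category/widgets/": "/widgets-landing/",
        }

        @app.route("/category/<path:rest>")
        def old_category(rest):
            target = MOVED.get(f"/category/{rest}", "/")
            # 301 (permanent), not Flask's default 302, so link equity consolidates.
            return redirect(target, code=301)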

    | GastonRiera
    0

  • I have to say, GTAMP, I agree with all the above advice. As Jared notes, this isn't something that's going to wreck your traffic on its own. However, David is spot-on that it's certainly worth fixing if the cost isn't too high.

    | MattRoney
    0

  • The naked domain gives extra room for your permalink to be more visible in the SERPs, keeping some extra keywords visible instead of having them chopped off, so it has an SEO benefit for people who habitually put long-tail keywords in their permalinks. If you think shorter is better, then the non-WWW approach will suit you; there are apparently no "technical" benefits to it beyond reducing redundancy, since the www prefix is no longer needed. To be practical, it makes no drastic difference whether you use a naked domain or a domain with www as the subdomain. That said, a www domain is better remembered by people as a brand: people are used to recognizing websites with the www prefix and are most often observed typing it into the address bar. The advantages you may get from www are the ability to restrict cookies when using multiple subdomains, easy segregation in the file structure of your website, and more flexibility with DNS. My preference is always to use WWW; whichever form you choose, redirect the other one to it (a sketch is below). You can read a further guide on this page: http://www.hyperarts.com/blog/www-vs-non-www-for-your-canonical-domain-url-which-is-best-and-why/
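
    A minimal sketch of enforcing one canonical hostname with a 301, assuming Python/Flask and a hypothetical domain (flip the constant if you prefer the naked domain):

        from flask import Flask, redirect, request

        app = Flask(__name__)

        # Assumption: www chosen as the canonical host.
        CANONICAL_HOST = "www.example.com"

        @app.before_request
        def enforce_canonical_host():
            if request.host != CANONICAL_HOST:
                url = request.url.replace(request.host, CANONICAL_HOST, 1)
                # Permanent redirect so engines consolidate signals on one host.
                return redirect(url, code=301)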

    | Mustansar
    0

  • Is this happening to all of your titles or just some of them? To me it sounds like you've somehow set up WordPress (or a plugin) to add a prefix to all your titles, which then forces "XYZ Company |" in before every title you've made. If that's the case you should be able to find the setting somewhere and change it back. A quick way to check across pages is sketched below.
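
    A quick diagnostic sketch using only the Python standard library (the URLs are placeholders for your own pages): fetch a few pages and print their <title> tags to see whether the prefix appears everywhere or only on some templates.

        import re
        from urllib.request import urlopen

        # Assumption: replace these with a sample of your own URLs.
        PAGES = [
            "https://www.example.com/",
            "https://www.example.com/about/",
            "https://www.example.com/blog/some-post/",
        ]

        for url in PAGES:
            html = urlopen(url).read().decode("utf-8", errors="replace")
            match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
            title = match.group(1).strip() if match else "(no title found)"
            print(f"{url} -> {title}")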

    | LSIversen
    0

  • Hi, we are using www.9.digital and it's ranking well in the SERPs for related keywords, so I don't think preference is always given to .com or .net domains. Thanks!

    | 9digital
    0

  • Just FYI, I think you will find the robots.txt is fine. Just to test this, I used a couple of the online testing tools to confirm: http://technicalseo.com/seo-tools/robots-txt/ http://tools.seobook.com/robots-txt/analyzer/ For peace of mind, I would check this in Search Console and use the robots.txt tester in there as well. You can also check it locally with a few lines of Python, sketched below. -Andy
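
    A small local check in the same spirit as those online tools, using Python's standard-library robotparser (the domain, paths, and user agent are placeholders):

        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser("https://www.example.com/robots.txt")
        rp.read()

        for path in ("/", "/partners", "/blog/"):
            allowed = rp.can_fetch("Googlebot", f"https://www.example.com{path}")
            print(f"{path}: {'allowed' if allowed else 'blocked'}")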

    | Andy.Drinkwater
    0

  • Hi Gianluca, So if I want to create a campaign for the Spanish version of the site, I should add www.ourdomain.com/es-es as the site to crawl and track it? Thanks, Carlos

    | carlostinca
    0

  • Thanks for the response. But presumably when Moz shows me search visibility scores, this takes mobile into account as well? So maybe our landing page not being particularly optimised for mobile is one factor in why our visibility has fallen?

    | KerryK
    1

  • Hi, Basically, you can't have two pages competing for the same keywords - this will cause you issues in Google and you will end up with just one or the other being ranked. There really isn't a way around this. You could have the new page as a link off the existing page, though, and promote it that way as an "or how about this" feature? -Andy

    | Andy.Drinkwater
    0

  • Ahh, I see. Sorry then - this isn't something I have actually tried for myself or any clients, so I don't have any data to give you that would be useful, I'm afraid. -Andy

    | Andy.Drinkwater
    0

  • Moz's index isn't a complete view of the web, but we try to crawl the quality sites out there, and our index is always growing. To be frank, no one -- not even Google -- crawls the entire web. There are just too many pages out there and computing power is expensive. Google crawls the most links simply because Google is a bigger company with more money, more servers, and more engineers, and whose business depends on serving web pages to everyone. At Moz, we focus on quality sites because we believe (and hope) our customers and community are focusing on high-quality links that are still active, which are going to give them the best SEO results. (Also, a huge part of the web is just spam nonsense.) Many SEOs -- and I include myself here -- will use multiple sources of link indexing for the broadest view of the web and their sites. I use Moz's OSE and Google Webmaster Tools (free and big, but it includes lots of spam, which can be great if you're fighting a Penguin penalty). Others will use OSE, GWT, and OSE competitors like Majestic and Ahrefs, especially if link building is huge for them and they want to account for every penny. Every index is a bit different.

    | EricaMcGillivray
    0

  • There is a long article on the dev blog about how they determine whether pages are duplicates - see https://moz.com/devblog/near-duplicate-detection/ - it's quite technical stuff, but this is the part which might interest you: "This leads to one of the questions we get asked a lot: Why do I see duplicate content warnings in the context of Custom Crawl for pages that I see as different. Ultimately, it’s always because of the same reason: because no dechroming is done, there is a small amount of unique content relative to the total content. One of the places where this crops up a lot is web stores, where there’s a large amount of chrome layout, but only a short product description associated with it." Dechroming: removing things like the navigation, footer, etc. from the page (the exact definition is in the article). If you compare both pages, apart from the image & product title there isn't too much difference between them, so the crawler sees only a very small % of content which is different and marks them as duplicates. There's a toy illustration of the idea below. Dirk
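
    A toy illustration of the idea in Python (the shingle size and sample text are made up, not Moz's actual parameters): shingle the page text and compare with Jaccard similarity - the more the shared chrome dominates the unique text, the closer the score climbs to 1.0 and the more duplicate the pages look.

        def shingles(text: str, k: int = 5) -> set:
            """All k-word shingles of the text."""
            words = text.lower().split()
            return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if a | b else 1.0

        # Two "pages" that share a lot of chrome but differ in a short description.
        chrome = ("home shop about contact cart search login register help faq "
                  "shipping returns blog careers press footer terms privacy cookies newsletter ")
        page_a = chrome + "blue widget with a two year warranty"
        page_b = chrome + "red widget with a two year warranty"

        # Mostly-shared chrome pushes this far above what the descriptions alone would score.
        print(f"similarity: {jaccard(shingles(page_a), shingles(page_b)):.2f}")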

    | DirkC
    0

  • Great, thank you both! Good idea, Casey. I tried to make it super hard on myself and looked up markup from BestBuy; I'll try a competitor.

    | localwork
    1

  • As per Mark, if you are only planning on using a single language then I would not bother creating a subdirectory for your site to reside in when it would sit best in the root. If, however, you eventually plan to branch out again and become a more global operator, I would keep your master site in the root and then branch out into supplementary languages, e.g. .com/fr or .com/de, depending upon what you have planned for the future. Again, as per Mark, subfolders are my preferred option rather than subdomains, but both have their pros and cons. If you do indeed look to branch out, also look into serving alternate-language (hreflang) annotations - there's a small sketch below - and definitely check out the article from Search Engine Land on multilingual sites; it's a great resource.
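
    A small Python sketch of generating those alternate-language annotations for subdirectory locales (the domain and locale list are placeholders for illustration):

        # Assumption: one subdirectory per locale under a single .com root.
        LOCALES = {
            "en": "https://www.example.com/",
            "fr": "https://www.example.com/fr/",
            "de": "https://www.example.com/de/",
        }

        def hreflang_tags(locales: dict) -> str:
            """Render the <link rel="alternate"> tags for each locale's <head>."""
            return "\n".join(
                f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
                for lang, url in locales.items()
            )

        print(hreflang_tags(LOCALES))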

    | TimHolmes
    0

  • I hear ya there. I have sites on Pair.com (really good) and I'm getting ready to move a site to Liquid Web. Both have exceptional support and reliability and "scale" well. I'd start looking at those specifically. Hope that's helpful, Charles. Good luck!

    | mediawyse
    0