Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Yeah, that's what's going on. It's using something called Sitecore. Looks like there's an issue on their end, but it's tough to troubleshoot from my perspective. Thanks.

    | jim_shook
    0

  • Thanks. I am going to make non-unique pages "noindex, nofollow" and I am going to get rid of rel="next"/rel="prev". Keeping "follow" on noindexed pages is such a minor benefit, and I think it might hurt my site since it allows Google to read what is on those pages (non-unique, duplicate-looking content). I will update in a few months with the results.

    | khi5
    0
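
    For reference, the directives discussed above are set with a meta robots tag in each page's head. A minimal sketch (the comments describe the two options being weighed; any page paths are hypothetical):

    ```html
    <!-- Option the poster chose: keep the non-unique page out of the
         index AND tell crawlers not to follow its links -->
    <meta name="robots" content="noindex, nofollow">

    <!-- Alternative discussed: keep the page out of the index, but
         still let crawlers follow the links on it -->
    <meta name="robots" content="noindex, follow">
    ```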

  • Keep in mind that if it is a link that is paid for in some manner, the search engines want to see a nofollow on that link.

    | KeriMorgret
    0
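
    The attribute the answer refers to goes on the anchor element itself; the URL below is a placeholder:

    ```html
    <!-- A paid link, marked so search engines don't pass ranking credit -->
    <a href="https://example.com/partner-page" rel="nofollow">Partner site</a>
    ```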

  • Hi, In my opinion the logo should not contain H1 or H2 tags if it does not contain indexable text such as a slogan, byline, etc. H1 should be used for the most important heading, comparable to how it's used in a Word document. This should be your main heading, and there should be only one H1 on a page. H2 should be used for the second most important heading, comparable to how a subheading is used in a Word document. There can then be multiple H2s on a page, since they describe sub-content. You can find more useful information about this and other SEO challenges in the great Moz Guide. Hope this answers your question. Have a great day, Fredrik

    | Resultify
    0
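
    The heading structure described above might look like this in practice (logo, URLs, and heading text are all placeholders):

    ```html
    <!-- Logo as a plain image link, with no heading tag around it -->
    <a href="/"><img src="/logo.png" alt="Example Co"></a>

    <h1>Main topic of the page (only one H1 per page)</h1>
    <h2>First sub-section</h2>
    <h2>Second sub-section</h2>
    ```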

  • Hi there! You've received some good advice so far. Did you see David-Kley's question about what CMS you are using? Please let us know so we can help you sort this out! Thanks. (Christy)

    | Christy-Correll
    0

  • **"I'd like to think not ALL duplicate content is bad."** I agree. However, more often than not, it is bad. It's best to approach each situation with caution.

    **"As long as your site has been crawled and that page is already in the SERPs, and the new website puts a link to the original article on your website, I don't think there should be any issue."** I don't feel that confident in this statement. In theory, yes, the original article would get the attribution. In practice, however, that may not be the case. If the site posting the duplicate content has higher authority and traffic, it may get the attribution even though you published the content first. Google does a decent job of figuring this out, but it's an area they need to drastically improve.

    **"I personally would re-work a new article or ask them to summarize your article, to be on the extra safe side. But that's just me."** Summaries are a good idea. Still, you can't beat new, unique content. Use the article for inspiration and create a new article that takes a point the original article made even further. Definitely include a link back to the original, if possible. This strategy would give you a better gain than simply posting or summarizing the original article with a link back.

    | Ray-pp
    1

  • The easiest way to resolve issues with tags is to noindex them. I wrote a post about how you can safely do this: http://www.evolvingseo.com/2012/08/10/clean-sweep-yo-tag-archives-now (you basically double-check whether they receive traffic, and leave the few that receive search traffic indexed).

    But at the root level it comes down to knowing how to use tags correctly on a blogging platform to begin with - knowing how they function and what happens when you tag something. First off, tagging any post creates a new page called a "tag archive". The only way someone can get to tag archives by default is if you allow some sort of navigation or links to them on the site itself. This is usually in the form of a "tag cloud" (sidebar or footer) or at the bottom of posts where it says "tagged in..." and links to the tags. If they are internally linked to, they will get indexed (unless you noindex them as I suggested above).

    They are typically low- to no-value pages because most bloggers just tag everything and use lots of tags per post. You then end up with hundreds of tag archive pages with no value. So noindexing them is the safest way to go, except for the very rare cases where a blogger uses them 100% perfectly (which is why I always assume most people asking should just noindex, but use my post to check for traffic to any of them first).

    | evolvingSEO
    0
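
    The safe default the answer describes comes down to emitting one tag in the head of each tag archive page; the `/tag/example/` path is hypothetical:

    ```html
    <!-- Placed in the <head> of each tag archive page (e.g. /tag/example/)
         so low-value archives stay out of the index while their links
         can still be crawled -->
    <meta name="robots" content="noindex, follow">
    ```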

  • One person's 'mostly unique' and another person's 'duplicate' are two different things. Without knowing the domain(s), it felt like a good idea to cover that angle - just in case. Though I would be inclined to agree in regard to unrelated links in many instances.

    | Travis_Bailey
    0

  • You can't set clientsite.com as the canonical home page, because it's not, FPD. The very definition of "canonical" means it is the actual URL that displays the page content. A URL cannot be canonical if it redirects, is noindexed, etc. In your case, using clientsite.com as the canonical would badly damage your ranking capability, because you'd be telling the search engines to rank a page that has no content and essentially doesn't even exist. Clientsite.com/store is the canonical homepage for your site, and should be set as such. The only way to change this would be to use a system that doesn't require redirecting the domain to the subdirectory. (And if you're sticking with the current system, make certain that the redirect being used is a 301, not a 302.) Hope that helps? Paul

    | ThompsonPaul
    0
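
    Declared in markup, the recommendation above would look roughly like this, using the questioner's placeholder domain:

    ```html
    <!-- In the <head> of the page users actually land on: the canonical
         points at the URL that really serves the content, not at the
         bare domain that only redirects -->
    <link rel="canonical" href="https://clientsite.com/store/">
    ```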

  • Adding more... The buttons from some shopping carts will work in .pdf documents.  So if you write one with a parts list, you can place a buy button beside each part to make it really easy for the person to purchase. Also, type your domain name in the pdf.  That way, if people print it and want to go back to your website you might get a navigation query.   Typing the URL where it can be found might do the same thing.

    | EGOL
    1

  • All links in Q&A are nofollowed, but we still ask that you not use unnecessary/unrelated links in your answers. Thanks!

    | KeriMorgret
    0

  • Thanks for the update.  Glad you got it figured out.

    | EGOL
    0

  • I would imagine that the best solution would be to point the sub-category canonically to the parent, thus making the parent category page the dominant authority page.

    | TimHolmes
    0

  • Fergclaw, It may boil down to the quality of those slim backlinks you mentioned. You call them relevant, but you don't attest to their quality. I think if you can honestly say that the few links you have are quality, editorial links, then redirecting those domains (not necessarily the specific URLs, for appearances' sake) to appropriate subdirectories on the new site could be a step towards keeping the whole thing under the filter threshold. As an additional appearances-sake step, you may choose not to redirect any of the other non-linked-to domains - I mean, what value is there in redirecting them, anyway?

    | Chris.Menke
    0

  • Thanks Everett, Rel="canonical" is in place, so that's covered. The URLs with the parameter are only accessible if you want to directly access a particular size. If you are on the default page and switch sizes from the dropdown, no URL change is presented. I have left Webmaster Tools to decide what should be crawled or not. The parameter has been mentioned there, though.

    | Bio-RadAbs
    0

  • I don't think that can be had for the price. I pay around $260 for a single Xeon with an SSD and 8 GB of RAM.

    | LesleyPaone
    0

  • A 301 is the way to go then. This helps Google understand how pages have moved. -Andy

    | Andy.Drinkwater
    0

  • Celts, Did you ever resolve this? What you were discussing back in 2012 is called a "hashbang", and you can learn more about it here on Google. It is technically a way to get AJAX-loaded pages indexed on their own URLs. You asked this question a couple of years ago, and things have changed since then: pushState and HTML5 are now preferred over hashbangs, and not loading a page's content with AJAX at all is still the recommendation when possible.

    | belasco
    0
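
    To illustrate the difference the answer describes (all URLs here are placeholders): a hashbang puts the page state in the URL fragment, which the server never sees, while the HTML5 History API lets an AJAX-driven page expose a real, crawlable URL:

    ```html
    <!-- Hashbang-style URL of the kind discussed:
         https://example.com/#!/products/42 -->

    <!-- HTML5 History API alternative: after loading content via AJAX,
         update the address bar to a clean URL the server can also serve -->
    <script>
      history.pushState({ productId: 42 }, "", "/products/42");
    </script>
    ```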