Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Dirk, thank you for your response; it is very helpful.

    | Palmbourne
    0

  • If you have your sitemap XML file(s) set up properly, you can resubmit them each time you update that specific content. If the site is very large, I would suggest having a separate sitemap file just for those review pages and resubmitting that one specifically. That can help motivate Google to recrawl that content sooner. Also, do you have "last-modified" dates set up? That can help as well. Depending on how high the quality of the content is, it can also help to send other signals:
    - Update the home page with a link to the newly updated review, right in the upper portion of the home page's main content area.
    - Consider a quality, not over-optimized, press release distributed through a trustworthy release site, describing the full review and linking only ONCE in the body, directly to that review.
    - Tweet a link to the review page on the day you post the review as well. Now that Google is integrating Twitter more, that can further help visibility.
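    As a sketch of the separate-sitemap idea (the file name, URLs, and dates below are placeholder assumptions, not real pages), a review-only sitemap with lastmod dates gives Google an explicit recrawl signal:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- reviews-sitemap.xml: hypothetical sitemap covering only the review pages -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/reviews/acme-widget</loc>
    <lastmod>2015-09-14</lastmod><!-- update whenever the review changes -->
  </url>
  <url>
    <loc>https://www.example.com/reviews/acme-gadget</loc>
    <lastmod>2015-09-02</lastmod>
  </url>
</urlset>
```

    You can then resubmit just this one file in Search Console after each review update, rather than the sitemap for the whole site.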

    | AlanBleiweiss
    0

  • Hi Peter, The conclusion I have come to is to try to keep page links to around 100, but, that said, there is a lot of conflicting info out there, which is where the confusion comes in. I have read that 100-200 is OK nowadays, but essentially the more links you have, the more the page authority will be diluted, so it makes sense to keep the number of links down.
    On this topic, Moz's on-page grader tool says: "Google has confirmed that the use of too many internal links on a page will not trigger a penalty, but it can influence the quantity of link juice sent through those links and dilute your page's ability to have search engines crawl, index, and rank link targets. Recommendation: Scale down the number of internal links on your page to fewer than 100, if possible. At a minimum, try to keep navigation and menu links to fewer than 100. See http://moz.com/blog/how-many-links-is-too-many."
    In my situation there are some unnecessary and duplicated links, so I am looking at ways these can be reduced. My hope is that the page authority will then be channeled to the more important areas of the site. Regarding the location of the links, those in the navigation will probably carry more weight than those in the footer, so I would try to include your most important links in the top part of the page. If you look at Moz's home page, for example, the footer links point to pages like Contact Us, research tools, parents, API, and Terms rather than duplicating the main navigation again. That is my take on the situation, anyway. Hope this helps, Andy

    | caravan
    0

  • As others said, rewrites are common, but rewrites that include information that no longer exists are a bit odd. Typically, it means either Google is caching old content somewhere (including a duplicate copy of the page), or they're pulling from a directory somewhere, like the Open Directory. You could try the NOODP tag, since it's harmless, although the hit rate is low (i.e. don't get your hopes up). If it were me, I'd put a unique snippet of the undesired meta description in quotes and do some exact-match searches on Google. Try to find out where Google thinks that text lives.
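    For reference, the NOODP directive mentioned above is just a standard robots meta tag placed in the page's head (nothing site-specific here):

```html
<!-- Asks search engines not to use the Open Directory (DMOZ) description for this page -->
<meta name="robots" content="noodp">
```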

    | Dr-Pete
    0

  • Hi there, I see what you're trying to do, and I think I understand it. You're attempting to conserve your link equity and flow it only to the most important pages, or what we used to call "PageRank sculpting." The good news is you don't really need to worry about it. These days, adding nofollow to your links doesn't increase the equity flowing through the followed links. In fact, you could be shooting yourself in the proverbial foot by denying equity-passing links to your lower product pages. The best time to use nofollow for internal pages is typically to increase crawling efficiency, or to prevent bots from visiting pages you don't want indexed anyway. Attempting to sculpt link equity in this way could cause lots of unintended negative consequences, and my advice in most cases would be to let your link equity flow freely throughout your site in a way that is natural to both humans and bots alike. Best of luck!
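    As a small illustration of the distinction above (the URLs are hypothetical), internal nofollow tends to make sense only on utility pages you don't want crawled or indexed anyway:

```html
<!-- Reasonable: a login page with no search value, so no need to spend crawl budget on it -->
<a href="/account/login" rel="nofollow">Log in</a>

<!-- Risky "sculpting": withholding equity from a real product page -->
<a href="/products/blue-widget" rel="nofollow">Blue Widget</a><!-- better left followed -->
```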

    | Cyrus-Shepard
    0

  • This is great advice. And don't forget that in Search Console (Webmaster Tools), under Crawl in the sidebar, you can use the URL Parameters tool to tell Google which parameters are used to paginate. You're then directly telling Google that that's how you would like those pages to be seen, in addition to using rel="prev"/"next".
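    For anyone unfamiliar with the markup, rel="prev"/"next" goes in the head of each page in the paginated series; the URL and parameter name here are placeholders:

```html
<!-- In the <head> of page 2 of a hypothetical paginated category -->
<link rel="prev" href="https://www.example.com/widgets?page=1">
<link rel="next" href="https://www.example.com/widgets?page=3">
```

    The first page of the series carries only a rel="next" link, and the last page only a rel="prev" link.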

    | Ria_
    0

  • Hello, my friend. See https://support.google.com/webmasters/answer/6216428#content_mismatch and read the "To Fix" part.

    | DmitriiK
    0

  • The client will not remove the content until we have new content ready to go. He will not leave these pages blank, despite our advice that no content is much, much better than duplicated/plagiarised content. We don't have the resources to write all this content within a short space of time, so we're going to have to run the risk of a penalty, as the client is not budging on this.

    | PeaSoupDigital
    0

  • Could you possibly add a line or two by each client? It could serve the dual purpose of getting some content in and enticing the user to click on the case studies. I'm not a fan of content just to have it, but that recent Whiteboard Friday on "cruft" comes to mind, and I wonder whether it applies here. Would be interested to see what others think. https://moz.com/blog/clean-site-cruft-before-it-causes-ranking-problems-whiteboard-friday

    | seosnyder
    0

  • Thanks Russ. Ultimately I'd like to remove the low-value links. We can work to increase links to the more important pages, but I was hoping to reduce these low-value links as well.

    | Jay-T
    0

  • Second confirmation ;-), he's right. It's one of those things that can really get you into trouble on bigger sites.

    | Martijn_Scheijbeler
    0

  • You may want to watch the latest Google Webmaster Hangout; John Mueller answers a question about this type of topic. You may also want to see the SEO journal article linked to it. John Mueller has also stated a number of times that there is no duplicate-content penalty as such, but my interpretation of that is that you won't rank as well as the original source. Many affiliate sites suffer from this kind of thing, as they share the same product descriptions etc. Pete

    | PeteC12
    0

  • That's what I prefer to do: describe the image with the alt tag. When you can use the target keyword (in step-by-step instructions you could surely find ways), do it, but describe the image first.
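    A quick sketch of that priority order (the filename, alt text, and keyword are made-up examples): the alt attribute describes what the image shows, and the keyword is worked in only where it fits naturally.

```html
<!-- Describes the image first; the target keyword ("birdhouse plans") fits in naturally -->
<img src="step-4-drilling-entry-hole.jpg"
     alt="Drilling the entry hole in the front panel - step 4 of the birdhouse plans">
```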

    | paints-n-design
    0

  • http://googlewebmastercentral.blogspot.com/2010/04/to-slash-or-not-to-slash.html - lots of good advice from an older blog post that's still valid and relevant today. Note the quote from John Mueller in the comments: "Rest assured that for your root URL specifically, http://example.com is equivalent to http://example.com/ and can't be redirected even if you're Chuck Norris." There may be some technical merit to Googlebot starting at the slashed version of the TLD, but I do not think there is any type of SEO advantage to that (if that is the case).

    | CleverPhD
    0

  • Great thank you! I'll give both a go!

    | BeckyKey
    0

  • I think you are on the right path by paying attention to this. Of course, you should start with some keyword research. I think you are correct that Brand is often searched before Type (i.e. "Nike running shoes" is probably searched more often than "running shoes Nike"), and certain styles are probably not categorized at all (i.e. a person searches for "heels", not "women's heels"), while others are (i.e. "men's sandals", "women's sandals"). This, of course, can quickly lead to a problem of a confused hierarchy.
    This is where tags can come into play. There will likely be some contradictions where you would have to choose one navigation style over another, even though doing so will force you to be less than ideally optimized for one or the other. But if you choose to noindex,follow your navigation and use an optimized tag methodology for your content that is index,follow, you won't have to compromise. Your users will navigate through the site normally, but your search landing pages will actually be tag-based categorization. This itself can be problematic for a number of reasons, but if you invest well enough in it, the solution can play out very nicely in the long run...
    - You will want to "seed" your tag list with keywords that match significant, unique sets of your content.
    - You will want to add unique content to each of your tag pages such that they do not overlap too much with other tag pages.
    - You will want to limit the total number of tag pages so that you don't have PageRank sprawl (the same huge index issues that faceted navigation can present).
    - You will want to try to align your tags with the words and phrases your users search for.
    - You will have to tag any and all products, but often you can match a category to a tag (i.e. womens-shoes/sandals/flip-flops could be automatically tagged as "Women's Flip Flops").
    Hope this helps put you on the right path!
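    The noindex,follow pattern described here is just a robots meta tag on the navigation pages, while the tag landing pages stay indexable; the example URLs are hypothetical:

```html
<!-- In the <head> of a navigation/facet page you don't want indexed,
     e.g. /womens-shoes?color=red&size=7 -->
<meta name="robots" content="noindex,follow">

<!-- Tag landing pages (e.g. /tags/womens-flip-flops) carry no such tag,
     so they remain index,follow and serve as the search landing pages -->
```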

    | rjonesx.
    0

  • On first use it looks pretty good. Cheers, Dmitrii.

    | ATP
    1