Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Google actually crawls 150 KB, excluding CSS files, images, etc. 150 KB is much more than 200 words, and the experiment suggested by Mr Bennett proves it.

    | NatanValencia
    0
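The 150 KB figure above is the poster's claim, not an official Google limit. As a rough sketch under that assumption, here is how one might check a page's raw HTML size against such a cutoff (the constant name and cutoff value are illustrative):

```python
# Assumed cutoff from the discussion above -- not an official Google limit.
CRAWL_CUTOFF_BYTES = 150 * 1024

def within_crawl_cutoff(html: str) -> bool:
    """Return True if the serialized HTML fits within the assumed cutoff."""
    return len(html.encode("utf-8")) <= CRAWL_CUTOFF_BYTES

print(within_crawl_cutoff("<html>" + "a" * 1000 + "</html>"))  # small page -> True
```

Note that this measures only the HTML document itself, matching the "excluding CSS files, images, etc." caveat above.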

  • To reiterate what Dave and Karl have said, don't worry about the number of links, worry about user experience. Do you feel that the navigation and design you've opted for delivers a great user experience? As an aside, the average ecommerce website product category page has 3 links to every product: 1) the product name, 2) the product image, 3) the call to action. It would be a poor user experience for any of those not to link to the product page.

    | johncallaghan
    0

  • No, it doesn't mean a single word; it simply means just one set of tags per page. The crawl is presumably finding pages with more than one h1-level title on the page.

    | WorldText
    0
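To illustrate the point above, a minimal sketch (assuming the crawl warning is indeed about multiple h1 titles) that counts `<h1>` opening tags on a page using Python's standard-library parser:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags -- the likely cause of the crawl warning."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

def count_h1(html: str) -> int:
    parser = H1Counter()
    parser.feed(html)
    return parser.count

page = "<html><body><h1>Title</h1><div><h1>Logo text</h1></div></body></html>"
print(count_h1(page))  # 2 -- more than one set of <h1> tags on the page
```

Any result above 1 is the situation the answer describes: not a single-word restriction, just more than one set of tags per page.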

  • Ñô, sorry, could not resist that. I would test it on other, better-known examples. If you keep coming up with 0 and you still find pages with the special spelling in the results, then I would suggest Google copes with it. Having said that, "A B C" will rank better than "C B A" for the term "A B C", so having the exact match may always give a slight advantage.

    | AlanMosley
    0

  • I would not worry about moving the code; I would use HTML5 elements to tell the search engine what is what, such as <header> <nav> <menu> <aside></aside> </menu> </nav> </header>

    | AlanMosley
    0

  • Hi Takeshi, now I realize that my client is offering 4 tours in the Amazon. He is just mixing the attractions and lengths of the itineraries to create different tours. So there is a lot of duplicate content, which may explain his poor PageRank (although his pages are ranking). On the one hand it's good to see different combinations of itineraries, but I think the concept of such a site should be different: each attraction should have its own page with unique content, to be able to focus on different keywords. Then he should propose possible combinations of these attractions without always repeating the same content. That would be the only way to make the site better... right?

    | inlinear
    0

  • If you corrected what it tells you, it should go away. Perhaps you just made the update and rogerbot hasn't crawled since? Always wait for the next weekly crawl to see updates in those rankings, in case you did not know that.

    | jesse-landry
    0

  • First of all, do not, I repeat, DO NOT nofollow your own internal pages. Doing so will not result in your other links getting more link value, but will result in link juice evaporation: http://www.seomoz.org/blog/google-maybe-changes-how-the-pagerank-algorithm-handles-nofollow If you don't want to divide your link equity, the best solution is to simply remove the link. I would encourage you to install a tool such as CrazyEgg or ClickTale that will show you exactly how your users are using your site. Chances are they are not clicking on many of those footer links. Based on that click data, remove the links that aren't being clicked. Also, having a few more links than what SEOmoz recommends is not the end of the world, especially for a high Domain Authority site like yours.

    | TakeshiYoung
    0

  • I have used unique identifiers to create unique page titles before, but only because it was a huge site that required automation. It's not the most elegant way to add dynamic unique content (but it does ensure it, which is why it's a good technique for huge sites), and it could reduce CTR. The bigger problem I see here is that both title tags are competing for the same thing. If users are able to generate similar descriptions resulting in duplicate title tags, that is probably very rare, and you can probably go in and edit the duplicates by changing one character.

    | irvingw
    0

  • Hi Scott, What you're suggesting - somehow assigning each blog post to a specific location - doesn't make sense to me for taxonomy purposes or for SEO purposes. Assuming you assigned the posts to a specific taxonomy for locations, whether that looked like Tags or Categories, it provides little value aside from adding a small amount of text to the page. Internal linking is one way to take advantage of the fact that you have that much unique content, but that really boils down to 2 tasks: 1) Ensure that each of the 4 locations has a dedicated location page (eg domain.com/locations/seattle-wa/) that is linked from your nav menu or sidebar, in order to make sure every page on the site links there. Be mindful of the anchor text you use - it's hard to use keyword-heavy anchor text for a location page without it looking spammy. 2) Add links within the body content of the page - when it's relevant to do so. You can also begin to produce blog content that is geographically relevant as a way to directly build links to these location pages. For example, if your 4 locations were paint supply stores, and one location was in Seattle, you might create an article on "18 Seattle Homes With Crazy Paint Jobs" - this would be an effective piece of content to link directly to your Seattle location page. You'd want to do it in a natural way. Here's a good vs. bad example: Good: "We sent out 2 paint consultants from our Seattle location to scout out the most off-the-wall paint jobs in the Emerald City." Bad: "Read more about our Seattle paint store."

    | KaneJamison
    0

  • Will do! It's weird; now this morning it's showing two keywords that I ran a manual report on yesterday in the results... Support will help, I am sure. Thanks!

    | jkatzer
    0

  • I have changed title tags on several sites, several times, and those sites never experienced any kind of negative impact. There was one instance in 2009 when I updated a title tag on an ecommerce site. This was one of many changes made at the same time. One of the changes I was "directed" to make was to add about 6-8 really spammy direct links using exact-match anchor text to the home page. The site dropped from 1st position for its brand name. The cause was determined to be my optimized title tag... not the spammy links the CEO decided to add to the home page. Within a few days the site was back in the number one spot. I think this is a good example of the correlation = causality fallacy Brandlabs mentioned above. Since I know Brandlabs specializes in Volusion, you might find it interesting that the scenario I just described occurred on a Volusion store.

    | danatanseo
    0

  • Hi Jake, Thanks for the question. First off, I highly recommend reading this article by Dr. Pete: How Many Links are Too Many? (he gets asked this question a lot!) Short answer: over 100 links isn't necessarily bad. If you're consolidating those links via rel=canonical, Google is likely doing this in their crawl, and therefore reducing the crawling of every single URL in order to save bandwidth. If those pages rel canonical to each other, you may not have a problem. I would recommend checking your parameter settings in Google Webmaster Tools, just to be safe. That said, I've often found that reducing the number of links on a page often improves user experience. There's a certain psychology around limiting choices that makes decisions easier for users. I'm not saying this is right for your situation, but it's always best to question your assumptions. Cheers.

    | Cyrus-Shepard
    0
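A quick way to get the raw link count the discussion above refers to is to count `<a href>` tags. A minimal sketch using Python's standard-library parser (this counts markup links only; it says nothing about how Google weighs them):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that actually carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

page = '<nav><a href="/">Home</a><a href="/shop">Shop</a></nav><a name="top">anchor</a>'
print(count_links(page))  # 2 -- the named anchor without href is not a link
```

Comparing this number against the roughly 100-link guideline cited above is a reasonable first check before worrying about crawl behavior.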

  • Yep, this would work within the travel sector - "The UK's No. 1 Destination", etc. I think it would only benefit the CTR if the offering is indeed the best, or the client is happy to claim they are the best at what they do. Thanks, guys.

    | Webrevolve
    0

  • Makes sense; I absolutely don't want to take chances with the menu and have it mistaken for cloaking. We will now look at other solutions for a more traditional menu with better internal linking and fewer links. Thank you for your input!

    | AJPro
    0

  • Thank you for the responses! Greatly appreciated. I'm just wondering if someday Google will see this as duplicate content... as it does with written content. Why is it that news sources can syndicate content, but if I put that content on my blog, it's duplicate content?

    | RoxBrock
    0

  • I wouldn't re-write old posts. If they can be refreshed or added to with recent updates, go ahead and redirect (if it can be redirected without losing any additional info) or link to the new version. Things get tricky if there's nothing new that can be written about the post. First, kill the really bad stuff, as Mike suggested, and keep the good stuff. The stuff on the borderline is probably not worth keeping unless it was still receiving traffic. In my experience with Panda, using 410s on bad pages is better than redirecting, but you will probably want to 301 redirect to the next-best page if you have good links. If it was still receiving organic traffic, think about what you can do to make it better or provide additional resources and reading. Try to save traffic-generating pieces by improving them and making them useful to the people who were landing on them. For high-traffic pieces, you will want to look at the organic keywords and make sure the page somehow answers the query. As always with Panda, make sure your design doesn't turn people off and that you're not filling the template with too many ads.

    | Carson-Ward
    1
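The triage described in the answer above can be summarized as a small decision helper. This is purely illustrative: the categories and return labels are our assumptions, and "quality" is a judgment call, not a metric from the post.

```python
def panda_cleanup_action(quality: str, has_good_links: bool, has_traffic: bool) -> str:
    """Illustrative triage for thinning content after Panda, following the
    strategy above: keep the good, save what earns traffic, 301 what has
    links, 410 the really bad stuff.
    quality: 'good', 'borderline', or 'bad' (an editorial judgment)."""
    if quality == "good":
        return "keep and improve"
    if has_traffic:
        return "improve"                    # save traffic-generating pieces
    if has_good_links:
        return "301 to next-best page"      # preserve link equity
    if quality == "bad":
        return "410"                        # better than redirecting bad pages
    return "remove"                         # borderline, no traffic, no links

print(panda_cleanup_action("bad", has_good_links=False, has_traffic=False))  # 410
```

The ordering encodes the answer's priorities: traffic and links are reasons to rescue a page before resorting to a 410.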