Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • ...as you go deeper and deeper into the site, the link juice is divided more and more. My question: Is this really true or just a concept? I believe it is true. The more internal links you have hitting deep pages, the more importance they will receive in Google's rankings. But in addition to straight link juice there is also anchor text value.

    I don't want to guess and give you a generic answer on what to do. If this were my site, I would be looking at analytics to see where the traffic is coming in and where it is coming from. Are those deep pages pulling any visitors from the search engines?

    A more important problem you might have is thin content. Many of your pages have very few words on them, and some of your categories are empty. That could be a problem with Google, as the recent Panda update is believed to be demoting sites with thin content. I have also found that increasing the content on a page from a few words to a few sentences to a few paragraphs often results in a progressive rankings increase in low-competition niches.

    I have a blog that gets a lot of short posts, and we regularly delete posts and redirect their URLs because they have a lot of value for a short period but are not "evergreen" content (declining traffic). We do that to conserve link juice and to avoid being seen as a site with a ton of trivial content.

    | EGOL
    0

  • I had the same problem on http://www.tokenrock.com because I was doing a lot of URL rewriting - it's a CMS I wrote myself, but the same issue applies. I went from 7,000+ errors according to SEOmoz down to 700. Here are a few things I did: use canonicals on everything you possibly can, and 301-redirect the items in the SERPs that are identical. I'm not familiar enough with Magento to help you work through that side of it. Having a link like domainname/leather-chairs-244-16-price-1.html would work much better. The ones you have listed appear because somewhere you (the site) have a link to them. Unfortunately, some CMSs are written by developers who don't fully understand SEO and why the '?' in URLs is a bad thing.
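
    On an Apache server, the 301 part of this advice is usually done in .htaccess. A minimal sketch - the URL patterns below are made up for illustration and would need adapting to your own catalogue:

```apacheconf
# .htaccess — 301-redirect a query-string duplicate to its clean rewritten URL
# (example patterns only; match them to your own product URLs)
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=244&price=1$
RewriteRule ^products\.html$ /leather-chairs-244-16-price-1.html? [R=301,L]
```

    The trailing `?` in the target strips the old query string so the redirect lands on the clean URL.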

    | sferrino
    0

  • Thanks for the quick reply. The pages aren't identical - I've managed to get 100-150 words of unique content onto each, but it's very dry and not great. I could certainly do better, though not on every page, only one or two. I think I like the 301 idea: 301 the pages, take the old ones down, and bolster the content on the master page.

    | vforvinnie
    0

  • It's definitely a judgment call. There is legitimate reasoning for having a link to the widget author/creator if it's a single link, and providing the option to remove the link helps. Yet it's not 100% clear whether Google does or does not penalize for this.

    | AlanBleiweiss
    0

  • Hey Jorgediaz, first off I think it would be wise to add canonical tags specifying the primary URL for all of your pages. Additionally, it wouldn't hurt to add the parameter in question in Google Webmaster Tools, letting Google know to ignore your affiliate parameters; you can find that in the Site Configuration settings under the 'Parameter handling' tab. I personally wouldn't worry too much about the 'loss of link juice', since I think what Matt Cutts is talking about is more the duplicate content that results from shopping carts serving up very similar pages based on a filter (such as re-ordering products by price). In my experience affiliate links aren't the greatest in the first place; many are probably even using your publisher ID to send the link through an intermediary source for tracking purposes. So to recap: if it were me, I'd add the canonical, add the parameter in your Webmaster Tools, and leave it at that. Hope this helps.
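
    The canonical tag recommended here goes in the <head> of every URL variant and points at the primary URL. A sketch - the domain and parameter name are hypothetical:

```html
<!-- In the <head> of both /widgets.html and /widgets.html?aff=123,
     pointing Google at the one primary URL -->
<link rel="canonical" href="http://www.example.com/widgets.html" />
```
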

    | KT684
    0

  • As David points out, the meta keywords tag is ignored by every search engine that matters. If you poke around the SEOmoz tools you'll notice they recommend getting rid of them, only to prevent your competition from sniffing out your keywords (they still can, but they have to put more effort into it). Post tags, in a WordPress context, are more a way of organizing your content for the benefit of your human readers, not the engine crawlers. A lot of blogs will use a tag cloud to show topics that are often covered, provide a cool visual element on the page, and help readers browse to what they want to read. There is a potential to hurt your SEO with these tags, though. If you're on WordPress I recommend the All in One SEO Pack plugin: with it you can mark your tag pages (and author pages) as NOINDEX to prevent any possible duplicate content penalties.
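
    Marking tag and author archives NOINDEX ultimately renders as a robots meta tag on those pages (the plugin emits it for you; this is just what the output looks like):

```html
<!-- On tag/author archive pages: keep them out of the index,
     but still let crawlers follow their links -->
<meta name="robots" content="noindex,follow" />
```
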

    | AdoptionHelp
    0

  • Yes, it can. What technology did you build your site with? I am a .NET developer, so I can instruct you if it is .NET; someone else can do so if it is not.

    | AlanMosley
    0

  • Mmm... yeah, hard to guess without looking at the site then. In my own experience/research, these are some of the issues I found in many of the sites affected by Panda:
      • Intrusive advertising, excessive use of AdSense, sites created only for AdSense or solely to promote a product
      • High amounts of duplicate content / scraped content
      • Bad user interface / "ugly" design
      • Usage data - low click-through rate, low time on site, 100% bounce rate
      • Content analysis - content that is not usable/readable/easily consumable
      • Excessive internal linking to only one or two pages

    I don't mean that your site is spammy, but in some cases, like news sites with advertising, articles go out with just a couple of paragraphs of content, so that single page becomes more advertising than content. Consider posting your site; it would be nice to take a look. And there is also one last possibility: your site is innocent and just got hit by mistake, it happens.

    | andresgmontero
    0

  • I've begun using this tool to compare pages for duplication. The page says 80%+ counts as duplicate, but I would be far more conservative. http://www.wordsfinder.com/tool_duplicate_content_checker.php
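
    The linked tool doesn't publish its algorithm, but a common way to approximate "how duplicate are these two pages?" is Jaccard similarity over word shingles. A minimal sketch (the shingle size is an arbitrary choice, not the tool's method):

```python
# Rough page-duplication score: Jaccard similarity of n-word shingle sets.
# 1.0 means identical word sequences, 0.0 means no shared n-word runs.

def shingles(text, n=3):
    """Set of n-word shingles extracted from a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def similarity(a, b, n=3):
    """Jaccard similarity of the two texts' shingle sets, between 0 and 1."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

    Being "far more conservative" than the tool's 80% would mean treating pages as duplicates at a noticeably lower score.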

    | AdoptionHelp
    0

  • Yeah - that's what we thought; we were surprised not to be able to find any links or anything on it!

    | AxonnMedia
    0

  • I can't understand the web designer's difficulty. I also don't understand why he talks about DNS migration if there are two different domains. What he has to do is make an exact copy of the site on the new domain, on a new hosting plan. Once the copy is hosted on the new domain, he sets up a 301 redirecting the old domain to the new one. That way all pages, including the internal ones, will redirect correctly and you will not lose rankings.
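
    On Apache, the domain-wide 301 described above is a few lines of .htaccess on the old domain. A sketch - olddomain.com and newdomain.com are placeholders:

```apacheconf
# .htaccess on the OLD domain: permanently redirect every URL to the
# same path on the new domain, so internal pages carry over too
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```
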

    | j0a0vargas
    0

  • If we forget the anchor text for a second and focus on PageRank only, it should split in two, less the damping factor (0.85), and then converge back into the same URL. If only one link to the same URL from each document counts for PageRank passing, I would like to see a study or a test that demonstrates that.
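
    The split-and-converge behaviour can be sketched with a toy PageRank iteration. Everything here - the two-page graph and the "every duplicate link counts" model - is an assumption for illustration, not a confirmed Google behaviour:

```python
# Toy PageRank: under the naive model where each duplicate link passes its own
# share, two links from A to B deliver exactly what one link would, because
# the two half-shares converge back into the same URL.

def pagerank(links, d=0.85, iters=50):
    """Iterative PageRank over {page: [outlinks]}; duplicate links allowed."""
    pages = set(links) | {t for outs in links.values() for t in outs}
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - d) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = d * pr[p] / len(outs)  # PageRank splits across links
                for t in outs:
                    new[t] += share  # shares to the same URL add back up
        pr = new
    return pr

twice = pagerank({"A": ["B", "B"], "B": []})  # A links to B twice
once = pagerank({"A": ["B"], "B": []})        # A links to B once
```

    Under this model `twice["B"]` equals `once["B"]`; whether real engines instead count only one link per source document is exactly the open question above.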

    | Dan-Petrovic
    0

  • Sites that work hard at web accessibility often reap large SEO benefits. If you think about it, search engine crawlers have a pretty limited view of any given web page. They can't see the images; they work mostly by indexing text strings and ignoring large parts of a page's presentation. There are plenty of simple accessibility tips that help with SEO - for instance, add alt text to your images and build navigational components out of HTML lists. Google's SEO Starter Guide touches on accessibility, and the W3C's accessibility guidelines for HTML are also worth reading.
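
    A small sketch of both tips, with hypothetical file names and labels:

```html
<!-- alt text makes the image legible to crawlers and screen readers alike -->
<img src="office.jpg" alt="Our Chicago office at dusk" />

<!-- navigation built from a plain HTML list rather than scripts or images -->
<ul id="main-nav">
  <li><a href="/services/">Services</a></li>
  <li><a href="/about/">About us</a></li>
</ul>
```
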

    | TaitLarson
    0

  • It's important to have every physical location represented in Google Places. Ideally you should also have a corresponding page on your site for each of those locations, navigable from your home page through standard HTML links internal to the site (not through JavaScript or AJAX, etc.). Submit a separate Google Places entry for each - use the bulk submission process. Each should have its own local phone number, and each should be given maximum Google Places optimization treatment. The title of each should match your business name, though - don't try to get a unique title into each individual location's entry. The more actual locations you do this with, the more local searches you are likely to come up in. Don't forget to follow this up by submitting them all to YP.com, SuperPages.com, Yahoo Local, Bing Local, etc.

    | AlanBleiweiss
    0

  • Hi Joe, this is for sure an awesome question, with so many different points of view. The problem I see with .co is this one: "Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location." Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399 So if I understand this correctly, and you want to target real estate clients in the Chicago area (which I love and will be there for the U2 concert on July 4th) and across the US/worldwide, a .co domain is probably not the way to go. There has been a lot of talk about .co (the TLD for Colombia), same as .ws - supposedly "WebSite", actually Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone have a .co domain ranking in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether any .co sites rank in the real estate market. Hope that helps!

    | andresgmontero
    0