Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • Fascinating. If it works and suits your needs, that is what matters.

    | mikeusry
    0

  • Yes, thank you. I think it's far better to have unique titles than duplicates. Even though with this method a lot of them are going to be very short, it still avoids titles that are too long :-).

    | askshopper
    0

  • Hi Cyrus, Thanks for your reply! Unfortunately the problem is yet to be fixed; I hope that my disallow will work shortly. It seems that most of the index.php pages link to each other internally (and from old /index.php/ pages that no longer exist), which is super weird. How Google found them does not make any sense to me. I don't believe that external sources are linking to these pages either - I mean, how would they find these links anyway?

    | Mikkehl
    0
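
  • [Editor's note: the "disallow" the poster mentions is presumably a robots.txt rule. A minimal sketch of what that might look like - the exact paths are assumptions, not taken from the poster's site:]

    ```
    # Hypothetical robots.txt blocking crawl of index.php URLs
    User-agent: *
    Disallow: /index.php
    Disallow: /index.php/
    ```

    Note that Disallow prevents crawling, not necessarily indexing of already-discovered URLs, so already-indexed pages may take time to drop out.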

  • No problem my friend. You are most welcome. Best regards, Devanur Rafi.

    | Devanur-Rafi
    0

  • Hi Jeremy, Most of the time folks use inbred footer links for nefarious purposes, so Google generally tends to look down on these types of links and devalue them. In your case, however, it seems you actually have a use for them, as they bring a good amount of relevant traffic to your various sites. But you have to be careful. Footer links generally carry very little value and cause more potential harm than good. And as you noted, sitewide anchor text (especially in the footer!) can get you in trouble in a Penguin kind of way. The good news is 3 sites isn't a lot, and it's possible you haven't tripped any penalty filters. I do like your idea about changing up the anchor text. Do you rank for those terms? If so, have you seen a drop or rise in the past year? It may be beneficial to experiment with images and less commercial anchor text. Seems like you're on the right track. Let us know how it goes!

    | Cyrus-Shepard
    0

  • Sorry for the late reply. Feel free to send me a PM. (not sure I can help, but more than happy to take a look)

    | Cyrus-Shepard
    0

  • Hi Jorge, Good question! If I understand correctly, you have a choice between two file structures - is that right? Using option 1, is there anything that would actually live on website.com/utilities/, or is it simply a URL directory with no actual webpage? If this is the case, both option 1 and option 2 would pass the same amount of link juice, assuming you linked directly to each target. That said, it's usually desirable to have a flat architecture - meaning you keep your structure as flat as possible, using as few directories as possible. Shorter URLs are also generally correlated with better rankings and click-through rates. For this reason, if you are linking directly to each target, I would choose the second option if possible, although the difference it makes probably isn't that great. Hope this helps! Best of luck with your SEO!

    | Cyrus-Shepard
    1

  • I don't think so; it's still working for me. Here it is in cleartext: http://www.bing.com/community/site_blogs/b/webmaster/archive/2011/03/01/how-to-tell-bing-your-website-s-country-and-language.aspx

    | john4math
    0

  • ...so you're not creating a hundred low quality links with the same anchor text and shooting yourself in the foot.... Is your anchor text Texas hold 'em?

    | EGOL
    0

  • Sorry, I don't think I explained (1) very well. What I mean is that you may want to gradually change the site architecture so that not all of the search options are crawlable pages. This could mean putting some filters in form variables, for example (instead of links). It could also mean making sure that certain paths always converge. There's no easy solution. This is a problem all big sites face, and it's very dependent on the platform/CMS. With (2), a "level" could be anything. Maybe there are major cities you need to cover but everything else could stay out of the index. This really depends on your information architecture, but there's always something that's high priority and something that's low priority. If you can focus Google on the high-priority pages, it can definitely work in your favor. The trick is figuring out how to build the logic such that you can code that dynamically. I've found there's almost always an answer, but it can take some creative thinking. I definitely don't encourage doing it manually. If the results are easy to group by city and you can code that logic, the canonical may be fine. Since the search results could be different in some cases, canonical isn't technically the best choice, but it does often work. It really depends on how different they can be, so it's a bit tricky.

    | Dr-Pete
    0
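
  • [Editor's note: the canonical approach discussed above can be illustrated with a rel=canonical tag. This is a hypothetical sketch - the domain and city path are placeholders, not the poster's actual URLs:]

    ```html
    <!-- Placed in the <head> of a filtered search-results page, pointing
         Google at the main city-level page as the preferred version. -->
    <!-- Only appropriate when the filtered results are near-duplicates
         of the canonical target, as Dr-Pete cautions above. -->
    <link rel="canonical" href="https://www.example.com/search/dallas/" />
    ```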

  • Thank you Kane. I know the message isn't going to be permanent, but I imagine that the client would want it on there for perhaps several weeks.

    | GOODSIR
    0

  • They are the same page with different URLs. On my sites I add code to the .htaccess file to redirect /index to the actual home page URL. This removes the duplicate content, as both pages are the same, and stops both versions being indexed. I recommend you Google this problem to get the right code for your host server, as they do differ. Hope this helps, but it might not be what you were looking for.

    | askshopper
    0
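
  • [Editor's note: the .htaccess redirect described above might look something like this on Apache with mod_rewrite enabled. This is a sketch under that assumption; example.com is a placeholder, and the right rules do vary by host:]

    ```apacheconf
    RewriteEngine On
    # Match direct requests for /index.php (but not internal rewrites),
    # then 301-redirect them to the root URL.
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+index\.php [NC]
    RewriteRule ^index\.php$ https://www.example.com/ [R=301,L]
    ```

    Checking THE_REQUEST avoids a redirect loop on servers where the home page is itself internally served by index.php.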

  • Hi, add the http://yoast.com/wordpress/ SEO plugin, then import from the All in One plugin. Once that is done, hit "clean up after transfer". Sincerely, Thomas

    | BlueprintMarketing
    0

  • Excellent. Good luck on your climb to page 1.

    | SebastianCowie
    0

  • Hi Dan, Thanks for your answer. Would you really recommend using the plugin instead of just uploading the XML sitemap directly to the website's root directory? If yes, why? Thanks

    | Tay1986
    0

  • The problem is that there is little to no content on these pages, aside from the "boilerplate" stuff - header, navigation, sidebar, footer, etc. - which appears on every page. What that means to a crawler/bot is that these pages look very similar. If there aren't plenty of unique words on the page, there's always a chance that the page will look like duplicate or thin content. How many words count as unique? Hard to say, but a general rule of thumb for me has always been 150-200 words. It's possible that the number could vary depending on the ratio of boilerplate to unique text, though these days Google is pretty good at distinguishing boilerplate from content. The issue is that thin content can be a problem in a post-Panda world, although on the scale you're looking at, Google can most likely deal with it just fine. For more info, to help you make an informed judgement, I recommend this [great 'thin content' article from Dr. Pete](http://www.seomoz.org/blog/fat-pandas-and-thin-content). Could you add some descriptive text on the client directory pages, and redirect the empty articles somewhere else (or fill them with content)? Another option for the client pages could be to consolidate (and redirect) all of the categories into one client page - maybe using JavaScript to hide/show each category on request, to keep things tidy.

    | riplash
    0

  • Hello SEO5, Since I'm a virtual business wanting to rank well here in Dallas, as well as build my ranking nationally over time, your feedback on these two examples would be interesting: http://sqmedia.us/dallas-tx/customer-experience-optimization.html http://sqmedia.us/customer-experience-optimization-dallas-tx.html The first address is the easiest to work with, as I have a keyword in each page URL, although having the keyword not so far back in the URL might be better for keyword ranking. Any thoughts? Thanks, Steve

    | sqmedia
    0