Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • This shouldn't be too difficult a change, then; we only have around 20 articles that were hit by this particular issue. Thanks for the response; everyone was very helpful.

    | Izoox
    0

  • I see this is a WordPress site. Running a WebPageTest, your real problem is the Time To First Byte (TTFB); a poor figure here generally points directly at bottlenecks with your server. I also see that you are loading a number of jQuery scripts locally instead of through a CDN. The first thing I would do is go into the functions.php file, deregister those scripts, and then re-register them so that they load from the Google CDN for jQuery and other libraries. This will help by decreasing wait time on parallel HTTP requests, and some visitors will already have those scripts cached. All this sounds complex, but it is really simple to do: hook a function to wp_enqueue_scripts and call wp_deregister_script() and wp_register_script() inside it. You just have to look up the handle the theme registered the script under to deregister it correctly. (lol, this sounds hard.) The easiest one is jQuery; most themes use 'jquery' for the handle, so it would look like this:

    add_action('wp_enqueue_scripts', 'new_scripts');
    function new_scripts() {
        wp_deregister_script('jquery');
        wp_register_script('jquery', 'http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js', false, '1.7.1');
        wp_enqueue_script('jquery');
    }

    Because of the issue with TTFB, I would also look into a relatively new WordPress-centric host. I've just started using WP Engine and they are very good; they can help with TTFB issues, as they run Varnish cache and supposedly have something even faster in the works called TachyCache.

    | Runner2009
    0

  • TTS - the hosted version can be set up to pass you all the SEO value, but it has to be done right. WordPress will help you with SEO and with ease of maintaining the blog; it has always been a great blogging tool and is now also a great CMS. It doesn't solve your issue, but if you can figure out how to use it, you will be much better off.

    | Mark_Jay_Apsey_Jr.
    0

  • They most likely launched both in hopes of getting US and UK traffic to their site. The sites are not identical, and therefore they are doing nothing wrong from a traditional SEO standpoint. Now, I don't agree with their idea of having both a .co.uk and a .com; they should be fine with one of those. I am with Miriam that this should not be allowed, but I can see how it was missed by any algorithmic checks.

    | katemorris
    0

  • Hi Nakul, I appreciate your willingness to help! We actually resolved the issue with help from our developer: we now serve a standard 404 page to both viewers and bots, and we've implemented a routine to regularly search for viable redirects so we can eliminate as many 404s as possible. On a related note, there's a pretty good post on the SEOmoz blog today about this very topic - coincidence?!

    | Blenny
    0

  • That would be having it twice, which I would not recommend, though it's debatable. IMO, one mention per keyword in the URL is plenty. For regional links, focus on building links locally (by area/regional physical location) to each page.

    | SEODinosaur
    0

  • What is the goal of the sitemap? I'm assuming you are talking about an HTML sitemap, not an XML sitemap. If it is for users, guide them to the best category pages. If it is for SEO, don't have one; focus on building organized XML Sitemaps.
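    For context on the XML side, a minimal XML Sitemap is just a `<urlset>` of `<url>` entries; the URLs below are placeholders, not from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; only <loc> is required -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/category/widgets/</loc>
  </url>
</urlset>
```

    Keeping one well-organized file like this (or several, tied together by a sitemap index) does the discovery job an HTML sitemap is often mistakenly built for.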

    | Copstead
    0

  • Yes, we have resorted to this. We have a canonical tag pointing to the blocked .com version of the site, so hopefully that gets rid of the page. The client wants it removed 'yesterday', but there's only so much I can do when I don't control Google personally! Thanks again for the advice.

    | Grumpy_Carl
    0

  • I am not a fan of CMSs. I realize there are pros and cons, but when you try to do too much and be all things to all people, you tend to end up with a lot of compromises. There is one other reason I don't like to use robots.txt: I remember Matt Cutts saying that it can be a spam signal, because they cannot see what you are hiding. Not that it is going to get you flagged by itself, but combined with other signals it can. If I remember correctly, he was talking about hiding malware in scripts blocked by robots.txt. If you are interested, the best CMS for SEO I have found is Orchard CMS, but even that has some silly errors (it puts more than one H1 tag on pages); still, it is the best solution I have looked at, and it is more customizable via code.
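    For context, this is the kind of robots.txt blocking being discussed; the Disallow paths are hypothetical examples, not from the thread:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /scripts/
```

    Anything under a Disallow path is opaque to compliant crawlers, which is exactly why Google can't see what is being hidden there.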

    | AlanMosley
    0

  • This was raised as a potential issue in another thread I was contributing to: it's better to noindex category pages, pages beyond page 1, and author archives to help avoid duplicate content issues. It's better to build this functionality into the theme, because if you suddenly decided to change the WordPress plugin from Yoast to something else, the new plugin may or may not provide a means of either canonicalising or noindexing the relevant pages. I wrote an article about this here: http://www.laceytechsolutions.co.uk/blog/wordpress-development/avoiding-duplicate-content-in-wordpress/ I know plugins are persistent through theme changes, and that is why they are popular with most users, but for something like this, hard coding seems to be the best way to go. In my blog post I use the following code in the theme's header:

    <?php if (is_category()) { echo '<meta name="robots" content="noindex,follow" />'; } ?>
    <?php if (get_query_var('paged') > 1) { echo '<meta name="robots" content="noindex,follow" />'; } ?>
    <?php if (is_author()) { echo '<meta name="robots" content="noindex,follow" />'; } ?>

    This could be condensed as follows if you want to use all three:

    <?php if (is_category() || get_query_var('paged') > 1 || is_author() || is_trackback()) { echo '<meta name="robots" content="noindex,follow" />'; } ?>

    I hope this helps. For the record, for my blog SEOmoz is showing no errors and no warnings across the board.

    | blacey
    1

  • Thank you for your response, Mark. I have bought a domain name that better suits the overarching purpose of the website: www.myride.co.za. The domain is 5 years old and sounds good... I think ;) Now one more question: should I use www.myride.co.za as a landing page that links to the 3 individual niche sites through the main menu, or should www.myride.co.za be the main website, where clicking | Taxi Cab Hire | Limo Hire | Party Bus Hire points the user at the domain relevant to each of those businesses? E.g. the user clicks on party bus hire and gets sent to a page that looks the same but is hosted under www.mypartybus.co.za. The only commonality of the 3 businesses is that they are chauffeured transport; the businesses themselves hit 3 different industries, thus my choice of www.myride.co.za for the main page. I know this is complex, but I want to start right from the start, so any help from this community would be appreciated! To make it clearer, the best thing to do is visit my current main page: www.mytaxicab.co.za. I am putting a big drive into SEO and PPC and need something that will work well for both, as well as for link building, etc. Thank you so much, Chris Irwin

    | chirsirwin
    0

  • With a quick glance at both blogs, it looks like you're doing just fine. Both have a distinct look and feel (the Boy Scout blog is by far the more attractive, but that's just my opinion). You might get a higher click-through rate if you were to move the Boy Scout blog to its own keyword-rich (and shorter) domain, but the hassle and potential downsides of moving it mean it probably isn't worth it. Regardless, if you've been doing it for a while and it's working for you, I wouldn't change a thing.

    | Cyrus-Shepard
    0

  • Are you sure you don't have noindex on your tag pages? PM me the URL. Is it WordPress? Are you using any SEO plugin? Some plugins put noindex on tag and archive pages by default.

    | NakulGoyal
    0

  • It's just another link. It's completely navigable, and they have canonical tags in place to prevent any duplicate URLs. It looks good. http://www.ties.com/long+black-ties The canonical is consistent with the linked URL.
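    For anyone unfamiliar, a canonical tag is a single `<link>` element in the page's `<head>`; for the URL above it would look roughly like this (illustrative markup, not copied from ties.com):

```html
<link rel="canonical" href="http://www.ties.com/long+black-ties" />
```

    As long as every filtered or parameterized variant of the page carries the same canonical href, search engines consolidate the duplicates onto that one URL.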

    | NakulGoyal
    0

  • It's SEO best practice; that doesn't mean it won't affect something, but it's in your best interest. I think this Moz guide can say it waaaay better than I can: http://www.seomoz.org/learn-seo/duplicate-content

    | josh-riley
    0

  • It can take a while. I disagree very slightly with Alan and EGOL on one point: while 301s are traditionally more appropriate here, I often find that canonicals are pretty strong (and more than a hint). Both suffer the same problem, though - the signal has to be crawled and processed, and that doesn't always happen right away. I haven't seen any reports of it taking 2, 3, etc. crawls to happen, but I've definitely seen a page re-cached without the indexation signals being honored. Are these true duplicates, or did something change in the interim? If the duplicates don't seem like true duplicates, or you put thousands of them out there all at once, Google could choose to ignore the canonicals. If these really seem stuck, though, switching to 301s is harmless, and for a permanent URL change it is probably the better way to go. I wouldn't expect that to kick in instantly either, though.
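    For reference, switching to a 301 on an Apache server can be a one-line change in .htaccess; the paths below are hypothetical examples:

```apache
# Permanently redirect the duplicate URL to the canonical one
Redirect 301 /old-duplicate-page/ http://www.example.com/canonical-page/
```

    Unlike rel=canonical, a 301 also redirects visitors, so it is the right choice when the old URL should disappear entirely rather than remain browsable.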

    | Dr-Pete
    0