Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • In my experience with content management solutions, it's easier (and thus more common) to keep all blog posts under the blog directory, then use categories in the URL and/or the title tag (if necessary at all). That said, most of the time I find categories to be more useful for users and less helpful for engines. So, to me, the pros of using /category/content-title would only apply if the categories are very straightforward and short (which depends on your industry). For example, if you run an outdoor hobbies blog and you have clear, distinct categories like backpacking, camping, hiking, fishing, and so on, then using the category URL structure would be great. However, if your blog is more specific (say, backpacking only), your categories will start to become convoluted and long, taking up a large part of the URL. In addition, this can cause unnecessary redundancy in your URLs, which can look like keyword stuffing. Overall, I prefer to stick with /blog/content-title, and possibly include the category in the title tag if it's really important.

    | conquerapathy-64412
    0

  • Hi Keri, that's an interesting idea. I thought about it; right now, I don't feel ready to write a post like that. I'll keep it in mind, though: if I have to move a site again, I will prepare it better and do more detailed monitoring, and if that works, I'll have material to write a post that's worth reading. Thank you! Timon

    | ie4mac
    0

  • The logo actually reduces a whole lot via Smush.it, so that can help a little. Have you looked at the graphic elements that came with the theme that you didn't necessarily create? Yahoo's YSlow extension for Firefox can also point out a lot of other areas for speeding up the site.

    | KeriMorgret
    0

  • I would probably hire someone from oDesk to do the work.

    | AdamThompson
    0

  • For a site your size, I think the left side nav is fine (for SEO purposes). From a usability perspective, it's sometimes nice to have a mouseover menu in the main nav to give visitors easy access to category and subcategory pages, but that's just my opinion. By the way, love BigCommerce! We work on ecommerce sites 24/7, and that is by far my favorite for small to mid-size sites.

    | iAnalyst.com
    0

  • Never had this problem, but Will Critchlow recommended using HTML encoding for special characters. He said that they "tend to advise going for ASCII only" and that "HTML entities should generally be fine." So perhaps all instances should be changed to their HTML entity equivalents (see the sketch below this answer).

    | CPU
    0
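
    As a general illustration of the entity encoding mentioned above, here is a minimal sketch; the specific characters are hypothetical examples, not the ones from the original question:

        <!-- Raw special characters, which may be mangled outside UTF-8 -->
        <p>Café & Grill – “Open Daily”</p>

        <!-- The same content as ASCII-safe HTML entities -->
        <p>Caf&eacute; &amp; Grill &ndash; &ldquo;Open Daily&rdquo;</p>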

  • Jason, if you can post the contents of your robots.txt file, or give us a link to the site in question, we can help you diagnose what is happening. A second question: what type of content is being blocked? If it's a directory like /admin that is being blocked, the robots.txt is likely working as intended (see the sketch below this answer). You can also verify your site in Google Webmaster Tools and look at the crawling section there, as it will tell you which pages Googlebot hasn't been able to crawl. Google offers some help at http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html.

    | KeriMorgret
    0
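
    For reference, a minimal robots.txt along the lines of the /admin example above; the path is hypothetical:

        # Keep all crawlers out of the admin area only
        User-agent: *
        Disallow: /admin/

        # No other Disallow lines, so the rest of the site stays crawlable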

  • Hi Claudia, your home page has both domain and page authority and has thousands of links to it. The job scheduling page doesn't have any page authority and, as far as I can tell, has no links to it. The job scheduling page has a URL of www.uc4.com/what-we-do/business-automation/job-scheduling.html, where www.uc4.com/job-scheduling.html or www.uc4.com/products/job-scheduling.html would be more search engine friendly. Just to let you know, it took the homepage 150 seconds and the job scheduling page 180 seconds to load. Please note that is seconds, not milliseconds, and even then the site didn't load properly.

    | CPU
    0

  • I'd be careful with that footer you've got. I might be behind the times, but I think that could be considered keyword stuffing, which, last I checked, Google doesn't look too kindly upon.

    | PhoenixLander
    0

  • Also see Rand's recent update to the search engine ranking factors. Keyword usage in <title>, <h1>, and on-page content is all mentioned and measured.

    | jcolman
    0

  • Hi Dstrunin, I would still use the rel canonical tag, with or without the filter in place. So if you have a list of products displayed unfiltered at companyname.com/productcategory/page1.htm, I would add a rel canonical pointing at companyname.com/productcategory/page1.htm. For the filtered results, companyname.com/productcategory/filters/page1.htm, the canonical tag would still point to companyname.com/productcategory/page1.htm. It doesn't hurt to have a canonical tag point to the same page it's on. If you can't do that, I would meta noindex those filtered pages and remove the robots.txt stuff. Robots.txt doesn't tell Google they can't index a page; it only says they can't crawl it. So they could still index old stuff they crawled before you added the robots.txt rules, or index the title tags. Both options are sketched below. Casey

    | caseyhen
    0
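
    A sketch of the two options described above, using the hypothetical companyname.com URLs from the answer:

        <!-- On both the unfiltered page and its filtered variants,
             point the canonical at the unfiltered URL -->
        <link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />

        <!-- Alternative: keep filtered pages out of the index entirely
             (remove the robots.txt block so this tag can actually be crawled) -->
        <meta name="robots" content="noindex, follow" />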

  • Barry Schwartz has written about the last part (when it's not a DMOZ listing) at http://www.seroundtable.com/google-title-selection-12989.html. There have also been several comments in Q&A about Google not using either the site's title tag or the DMOZ entry for the title in the SERP.

    | KeriMorgret
    0

  • How often do people repost the same ad after they expire? Or do the ads expire? I'm thinking of Craigslist, where the ads expire after a certain number of days, and people repost the same ad. I would think you wouldn't want four copies of the same ad indexed, and especially for people to come to an expired version of the ad when there is a fresh one. Also, you'd be competing against yourself with multiple copies of the same content. You'd need to make clear to the user that the ad could still be indexed, and give them a way to totally remove the ad. I agree with the other comment about wanting to strip out any contact information. I don't know the best answer from a technical perspective (though I'd lean towards a), but wanted to point out some implications of other solutions.

    | KeriMorgret
    0

  • Hi Mike, from what I understand, there is no definitive time frame for how long it takes to update the webmaster diagnostics. Check and see if your changes show up in the Google cache first. If nothing has changed on the indexed pages, something is wrong with the crawl of your site. If the changes are reflected in Google's cache but are still missing from the webmaster diagnostics, I would just re-verify your site to get a clean start. One other thing to look out for: is your non-www domain 301'd to the www domain (a sketch follows below)? If not, your webmaster tools may be running diagnostics on the wrong URLs, rendering your changes useless. Good luck! If all else fails, just wait it out and take solace in the fact that you've made the necessary changes.

    | jsturgeon
    0
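
    One common way to implement that non-www to www 301, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder domain):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]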

  • When I looked at your site, changing the criteria changed the listings on the page, so each page was unique. However, I'm guessing 100% of the listings can be accessed just by clicking through the pages of results without changing the criteria? If you decide the best approach is to block the different versions of the search results pages, I would consider using the canonical link tag to specify the canonical (main) version of the page.

    | AdamThompson
    0