Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • It's an old proprietary filter used to try to fix rendering in older Internet Explorer browsers (IE 6 and earlier), which didn't properly support transparency in .png images. Its use is harmful for page optimisation because it blocks rendering of the rest of the page while it processes each element. Since most of us no longer worry about supporting IE 6 and older in our site designs, it's not worth using at the expense of all our modern-browser users. Paul

    | ThompsonPaul
    0
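For anyone auditing a stylesheet for this, the legacy declaration Paul describes is the proprietary `AlphaImageLoader` filter. A minimal sketch for finding leftover declarations in your CSS (the sample stylesheet and selectors here are hypothetical):

```python
import re

# Matches the old IE-only AlphaImageLoader filter declarations, e.g.
# filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='x.png')
LEGACY_FILTER = re.compile(
    r"filter\s*:\s*progid:DXImageTransform\.Microsoft\.AlphaImageLoader[^;}]*",
    re.IGNORECASE,
)

def find_legacy_png_filters(css_text):
    """Return every AlphaImageLoader declaration found in a stylesheet."""
    return LEGACY_FILTER.findall(css_text)

# Hypothetical stylesheet: one rule still carries the legacy filter.
css = """
.logo {
  filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='logo.png');
}
.hero { background: url(hero.png) no-repeat; }
"""
print(find_legacy_png_filters(css))
```

Any declaration it reports can simply be deleted once IE 6 support is dropped.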

  • Nope. Neither YSlow nor PageSpeed is able to differentiate third-party content from your own hosted content in its reports. So they're giving optimisation recommendations you can't possibly implement, because you don't control those external servers or code. Just ignore recommendations for any URLs you don't control. The one exception is when the report mentions external JavaScript that could be moved to the footer or loaded asynchronously. In those cases, do try to follow the recommendations where possible. Remember, these are automated reports based on a standard set of rules. You must interpret them using common sense for your own specific site and circumstances - they're not gospel. Paul

    | ThompsonPaul
    0
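One way to see at a glance which flagged script URLs are actually yours to fix is to split them by hostname. A quick sketch, assuming a hypothetical site host of `www.example.com`:

```python
from urllib.parse import urlparse

def split_scripts_by_origin(script_urls, site_host):
    """Split script URLs into ones you host (and can optimize) and
    third-party ones (whose recommendations you can ignore)."""
    own, third_party = [], []
    for url in script_urls:
        # Relative URLs have no netloc and must be served by your own host.
        host = urlparse(url).netloc or site_host
        (own if host == site_host else third_party).append(url)
    return own, third_party

# Hypothetical list of script URLs pulled from a report.
scripts = [
    "/js/app.js",                             # relative -> first-party
    "https://www.example.com/js/main.js",     # your own host
    "https://apis.google.com/js/plusone.js",  # third-party, not yours to fix
]
own, external = split_scripts_by_origin(scripts, "www.example.com")
print(own, external)
```

Everything in the second list is outside your control, exactly the category of recommendation Paul suggests ignoring.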

  • Right. Thanks for the advice again! I will definitely go for custom meta descriptions in the near future. Cheers

    | danielpett
    0

  • Thanks Gareth. That was the problem. Brett

    | casper434
    0

  • Thanks, Klarke

    | casper434
    0

  • Thanks, that is really helpful. I have the All in One SEO plugin, so I think I can do this.

    | econley
    0

  • If you nofollow the link you should be fine. Google looks at too many followed internal links on a page as an over-optimization (manipulation) attempt.

    | GoodAtMarketing
    0
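To check which links on a page already carry `rel="nofollow"`, a small stdlib-only sketch may help (the sample HTML below is hypothetical):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect followed vs. nofollowed <a href> links from raw HTML."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener".
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

# Hypothetical page fragment.
html = """
<a href="/products">Products</a>
<a href="/login" rel="nofollow">Log in</a>
"""
audit = LinkAudit()
audit.feed(html)
print(audit.followed, audit.nofollowed)
```

Running it over a page's source gives a quick count of how many internal links still pass link equity.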

  • Yes--but as the old saying goes--the "King" can do no wrong.

    | KevinBudzynski
    1

  • I think that Ajax drop-down option would be a good improvement from a user-experience perspective. But, being a JavaScript/Ajax control, I don't think it will have an impact on your site architecture from a search perspective. From Google's perspective the menus will not be populated with your product data, because its crawler will not trigger the Ajax calls.

    | cogbox
    0
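A quick way to see cogbox's point is to parse the HTML as shipped, before any JavaScript runs -- roughly what a crawler that doesn't execute JS sees. A sketch with hypothetical markup:

```python
from html.parser import HTMLParser

class MenuItemCounter(HTMLParser):
    """Count <li> elements in static HTML -- approximately what a
    crawler that doesn't execute JavaScript would encounter."""
    def __init__(self):
        super().__init__()
        self.items = 0

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self.items += 1

# The HTML as shipped: an empty container that client-side Ajax calls
# would populate only after page load (markup is hypothetical).
shipped_html = '<nav><ul id="product-menu"><!-- filled by Ajax --></ul></nav>'

counter = MenuItemCounter()
counter.feed(shipped_html)
print(counter.items)  # no product links are visible in the static source
```

Because the static source contains no menu items, the Ajax menu adds usability without changing the crawlable link structure.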

  • Thanks for the reply. We ended up doing it manually, as we only had a few hundred pages to update.  But I am going to keep this in my back pocket in case we run into the same issue with another client.  Thanks again!

    | ZephSnapp
    0

  • Alright, thank you very much. The link was very helpful as well!

    | danielpett
    0

  • I'm wondering if SEOmoz is overreacting to some of my link text, or if I should alter the text. When you are targeting long-tail key phrases, your main phrase is likely to be part of many long-tail phrases. In my case the homepage targets 'company formations', and the link that's been flagged in the On-Page Report is "Prices for Irish Limited Company Formation", which isn't that close of a match. It's a link in my main navigation, so whatever I change it to, chances are the wording will be similar to key phrases I am targeting on other pages. Also, making sure the link text makes the most sense to users is a priority! Thanks Martin, your responses have clarified this a bit.

    | annomd
    0

  • Hey there, Thanks for the question. If you have some old deprecated URLs that On-Page is still trying to check, you can get rid of them by clicking the Stop Running Weekly button, as seen here: http://www.screencast.com/t/r7HO1zkuS You can then either wait for new ones to be generated automatically for ranking keywords or manually enter them into your On-Page Report Card. I hope that helps. Cheers, Joel.

    | JoelDay
    0

  • Well, your root domain has trust and rank scores, so if you optimize your product pages (I'm assuming this is some sort of e-commerce website), more of that trust and rank will go to your root domain instead of being spread across any subdomains you build. The times when subdomains were considered completely separate from the root domain are over, so it should essentially reflect identically. I hope this helps. Sincerely, Thomas

    | BlueprintMarketing
    0

  • Thanks for the response. Appreciate it. You are absolutely right, it is reason enough. I was just curious to hear other reasons why having unique meta descriptions is the way to go, from someone more experienced!

    | danielpett
    0

  • The site isn't low quality (there are no ads and they don't sell anything -- it is a scientific site) -- it's just that EVERY link is available as a secondary or tertiary link. My initial thought is to simply get rid of the tertiary level within the main nav, cutting out roughly half of the links. On any inside page, they are available in a left-side nav anyway. The smallest number of links is about 110; the largest is pushing 250. I just wondered about everyone's opinion. Rendering the menu via jQuery as Chad suggests might help. This is a WordPress-based site, so I'll have to really look into it, as they have to edit it too. We've already begun mapping out clicks as goals (conversions) within GA.

    | digimech
    0

  • Ahh, I see! Thanks a lot. Really appreciate it. I also found from reading one of EvolvingSEO's blog posts that by checking my Google Webmasters account for any reports of duplicate content, I could see if Google had found any. There were no reports of this, so I guess it could be Roger crawling pages that Google doesn't? But I can see from viewing my source code that the code snippet you suggested I add isn't there. I will get back when I know for sure whether it's been solved! Thanks again.

    | danielpett
    0