Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • I recall seeing a question you asked about Google+ a day or so ago, but I did not know the answer to it. Sorry. I would suggest you at least provide a link to the question; otherwise no one will find it.

    | RyanKent
    0

  • As John mentioned, PageRank will be minimal, and new domains won't have much domain rank either (likely a lot less than your established main domain) if you want them for organic search purposes. Unless you plan to keep building on those domains and establish them as more than one-off landing pages (link building, etc.), using them over your main domain won't do you much good, short term or long.

    | Daylan
    0

  • One other thing to consider is that you have a lot of that text in graphics. I would especially get your links at the bottom (sitemap, stamp atlas, help, etc.) out of the images and onto the page so they can be read. Check out Google Webmaster Tools and fetch your page as Googlebot. That can really give you a good idea of the best way to optimize your site.

    | TroyCarlson
    0

  • "Should I worry about this, or will the other one simply overcome it in time?" You have properly 301 redirected the non-www version of your page to the www version, so it will fix itself in time. Your site has a very low DA, so Google may only visit you once a month. If you would like to resolve the issue faster, you can log into Google Webmaster Tools and set the preferred version of your site to "www". It is not necessary to make this change if you properly redirected your site, but the results would update faster. You can also confirm the redirect yourself; see the sketch below.
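
    As a quick way to verify that the 301 is actually being served, something like this works (a minimal sketch, assuming the third-party requests library is installed; example.com is a placeholder for your domain):

      # Check that the non-www host answers with a 301 to the www host.
      import requests

      resp = requests.get("http://example.com/", allow_redirects=False)
      print(resp.status_code)                # expect 301
      print(resp.headers.get("Location"))    # expect http://www.example.com/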

    | RyanKent
    0

  • Yes, it makes sense. Incoming links do contribute to your domain as a whole, but if you're targeting competitive keywords you're likely to need some links to the pages themselves. Also, if all of your links go to one page of your website, it looks suspicious to search engines; it's much better to spread them out a bit. Paul

    | PaulRogers
    0

  • One method: install JMeter on a remote machine in a VERY remote location and perform regular performance tests with it. The downside to this method: values may not be exact, as JMeter is a site-testing tool rather than a browser, so values may differ significantly from your browser experience. http://jakarta.apache.org/jmeter/

    Another method: web services such as yottaa.com. They offer performance testing and monitoring. I think they might also be using a JMeter implementation; at least it sounds like it. www.yottaa.com This may be the easiest solution, and I'd definitely give it a spin, but maybe you shouldn't rely on their data as your only source.

    Third method: build your own testing tool based on Google's PageSpeed SDK. http://code.google.com/p/page-speed/ That would be my favourite, but you need someone with coding experience.

    Sure, these are all rather advanced options (except for yottaa.com), but if you want reliable data you should take your time and use a proper tool.
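
    For a rough baseline before committing to any of the tools above, you can collect repeated fetch timings with a few lines (a sketch using only Python's standard library; example.com is a placeholder, and this measures raw HTML fetch time, not full browser rendering):

      # Time several plain fetches of a page; a crude baseline only.
      import time
      import urllib.request

      URL = "http://example.com/"  # placeholder; use the page you want to test

      timings = []
      for _ in range(5):
          start = time.perf_counter()
          urllib.request.urlopen(URL).read()
          timings.append(time.perf_counter() - start)

      print("min %.3fs / avg %.3fs / max %.3fs"
            % (min(timings), sum(timings) / len(timings), max(timings)))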

    | akaigotchi
    0

  • Hi guys, here is a video of Matt Cutts talking about this matter. Hope it helps.

    | CPU
    0

  • Some companies have a lot of sites covering various topics, for example, http://ninemsn.com.au/ The "100 links per page" guideline is a very old rule from the days when crawling web pages was much less sophisticated than it is now. Search engines now have the ability to crawl hundreds of links per page. The example you are using is a site with a DA of 85 that is in the top 1000 most-trafficked sites in the world; it can support the crawling of hundreds of links. Unless your site has an exceptionally high DA, you probably want to reduce your links to the minimum necessary to ensure a quality user experience.

    As you examine your links, determine which links are actually used. There are tools such as CrazyEgg that can help you evaluate your site. Imagine a site with 1000 pages and a link to all 1000 pages from the navigation bar. What you are telling search engines and users is that all 1000 pages are equally important, and that is probably not the case. The most important pages and categories should have a link, while the lesser pages would require an additional click.

    "Should these headers be implemented in javascript?" Search engines can crawl most javascript, so the best practice is still to reduce the links as mentioned above. Search engines reward sites that improve the user experience with higher rankings. If you offer 200 links and 150 of them are not used, you are bleeding PR and your site will not rank as well overall, which is by design.

    I'd prefer to reduce the number of links, but sometimes company policies don't allow this. Your role as an SEO is to educate the company on the importance of the changes you recommend. If a company refuses to implement your recommendations, there is not much you can do about it.
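
    As a first step in auditing how many links a page actually carries, a simple counter can be sketched (standard library Python only; the URL is a placeholder):

      # Count the anchor links on a page as a starting point for trimming
      # oversized navigation.
      import urllib.request
      from html.parser import HTMLParser

      class LinkCounter(HTMLParser):
          def __init__(self):
              super().__init__()
              self.count = 0

          def handle_starttag(self, tag, attrs):
              if tag == "a" and any(name == "href" for name, _ in attrs):
                  self.count += 1

      html = urllib.request.urlopen("http://example.com/").read().decode("utf-8", "replace")
      counter = LinkCounter()
      counter.feed(html)
      print(counter.count, "links found")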

    | RyanKent
    0

  • You're welcome! I understand your problem, and the canonical tag cannot solve it; it only works for internal duplicate content. I saw something about an external canonical, but it looks like you would pass your relevance to another site, and you don't want that!  ; ) Do you and your competitors have a lot of duplicated content? In cases of external duplicate content, I would suggest you write your own content and get ahead of your competitors that way! Have I answered your question? If so, click the Good Answer button! =]
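
    If you want a rough sense of how much your text overlaps with a competitor's page, a coarse similarity check can be sketched (standard library Python only; both URLs are placeholders, and comparing raw HTML is only a first approximation):

      # Compare two pages' raw HTML; a high ratio suggests heavy duplication.
      import difflib
      import urllib.request

      def fetch(url):
          return urllib.request.urlopen(url).read().decode("utf-8", "replace")

      a = fetch("http://example.com/page")
      b = fetch("http://competitor.example/page")
      print("similarity: %.0f%%" % (100 * difflib.SequenceMatcher(None, a, b).ratio()))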

    | seomasterbrasil
    0

  • Just did a spider of your website, and you have far too many subdirectories on those pages, which makes it less likely for search engines to crawl and index them. For example: http://www.movstore.com.br/Departamentos/+Quarto/86/0/0/0/0/0/0+40+Data/ By having 9 subfolders you are suggesting to search engines that the department content is buried deep in your website and less relevant. By comparison, your product page only has 5: http://www.movstore.com.br/Produto/Quarto+Infantil/Guarda+Roupas/Ponte+Juvenil+Teen+Branca+/6417/ Based on this directory structure, search engines think this page is more important than your category pages. Is there any way you can reorganize the content so the product pages sit below the category-level pages instead of under their own product directory? For example: /Category and /Category/Product/Sub-Product/Product-Name?
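
    Counting the directory depth of a URL is easy to automate if you want to audit the whole site (a sketch using only Python's standard library; the two URLs are the ones quoted above):

      # Count non-empty path segments to see how deep a page sits.
      from urllib.parse import urlparse

      def path_depth(url):
          return len([seg for seg in urlparse(url).path.split("/") if seg])

      print(path_depth("http://www.movstore.com.br/Departamentos/+Quarto/86/0/0/0/0/0/0+40+Data/"))   # 9
      print(path_depth("http://www.movstore.com.br/Produto/Quarto+Infantil/Guarda+Roupas/Ponte+Juvenil+Teen+Branca+/6417/"))  # 5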

    | Conor_OShea_ETUS
    0

  • I wouldn't spend too much time on the second (hyphenated keyword) domain; you'd be much better off focusing on building great links to your primary domain. Also, domains with hyphens are significantly less effective than the exact-match variation. I have conducted a number of experiments, and the non-hyphenated versions of the websites performed much, much better! So I would just redirect it as it is and focus on the one main website URL. Paul

    | PaulRogers
    0

  • Half our domain contains a very prominent keyword for our business; the second half is less prominent. Few to none would use a search exactly like our domain name to find our services. Did you ever consider moving your site from www.k-w.com to www.kw.com after you bought it? This is the second part of my quandary: even if I pay the $24,000 being asked for www.kw.com, I still have to consider whether constantly telling people 'oh, it's www dot keyword hyphen keyword dot com' is worth it, and whether the negatives of having a hyphenated domain outweigh the negatives of losing rankings for ages by moving. This is moving away from the original question a bit, and though I'd love to discuss this with you further, I understand if you don't have time.

    | JacobFunnell
    0

  • This was the approach we were considering. However, what gave me pause was looking at what other social networks are doing:

    LinkedIn only has 50k profiles.
    Meez has multiple index files and looks like they are indexing many of their profiles.
    Facebook and MySpace don't appear to use sitemaps (although they do have a crawlable member directory).
    WeeWorld has 500 links.

    The point is there doesn't seem to be any consistency. If we end up creating a human-readable directory à la Facebook, does that achieve the same thing?
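
    If you do go the sitemap route for profiles, generating the XML is straightforward (a sketch using only Python's standard library; example.com and the member slugs are placeholders):

      # Build a minimal XML sitemap listing profile URLs.
      import xml.etree.ElementTree as ET

      NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
      profiles = ["alice", "bob", "carol"]  # placeholder member slugs

      urlset = ET.Element("urlset", xmlns=NS)
      for slug in profiles:
          url = ET.SubElement(urlset, "url")
          ET.SubElement(url, "loc").text = "http://example.com/members/" + slug

      ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)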

    | JoeCotellese81
    0

  • CopyScape will let you do it manually, but they also have a paid service called CopySentry (http://copyscape.com/copysentry.php) that will run automatically and notify you of new instances of duplicate content on a regular basis.

    | AnthonyMangia
    0

  • That is what I've been reading everywhere, so thank you for making me feel more comfortable noindexing those pages. Thank you so much.

    | lwilkins
    1

  • Yeah... if you want the mobile version of your pages to appear in the mobile SERPs, I think you have to live with these errors.  If you'd prefer your regular page to appear in the mobile SERPs, and you'll redirect the user when they get there, then you could rel=canonical your mobile site pages to their corresponding www pages, which should take care of these errors.

    | john4math
    0