AJAX Server Snapshot Setup
-
Hi All,
Having just joined, we have noticed an alarming number of 'Too Many On-Page Links' warnings. We think this is because of our large navigation menu, which can be seen here: www.dirtbikexpress.co.uk
We have had our programmer set up some AJAX filters for the category and subcategory pages, but we are worried about whether the snapshot (the ugly URL that is returned) is set up correctly. At the moment it is just an ugly-URL duplicate of the same (complete) page. Does it only need to return the content that has changed?
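To make the setup concrete, here is roughly what happens at the moment (a simplified sketch in Node/Express terms, purely for illustration and not our actual stack; renderFullCategoryPage stands in for whatever our programmer actually wrote):

    // Sketch of the #! / _escaped_fragment_ mapping from Google's AJAX
    // crawling scheme, assuming a Node/Express server. Illustrative only;
    // renderFullCategoryPage is a hypothetical placeholder.
    import express from "express";

    const app = express();

    // A pretty URL like /category#!brand=fox is requested by Googlebot
    // as the "ugly" URL /category?_escaped_fragment_=brand=fox.
    app.get("/category", (req, res) => {
      const fragment = req.query._escaped_fragment_;
      if (typeof fragment === "string") {
        // Googlebot asking for the snapshot: right now this returns a
        // complete duplicate of the page for that filter state.
        res.send(renderFullCategoryPage(fragment));
      } else {
        // Normal visitors get the regular page, and the AJAX filters
        // take over client-side from there.
        res.send(renderFullCategoryPage(""));
      }
    });

    // Hypothetical server-side renderer: builds the whole page (head,
    // navigation, product grid) for a given filter state.
    function renderFullCategoryPage(filterState: string): string {
      return `<!DOCTYPE html><html><head><title>Category</title></head>
        <body><!-- full navigation + products for "${filterState}" --></body></html>`;
    }

    app.listen(3000);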
Basically, we want to make sure our AJAX and navigation menu are set up correctly, as we want the same kind of navigation and filtering that many large sites (John Lewis, Debenhams, etc.) have now.
We are finding that some pages are struggling to get indexed, and product pages are proving very tricky. The ratio of indexed pages to pages submitted in our sitemap in Webmaster Tools is also poor (roughly 1,000 indexed out of 6,000 submitted).
We would be really, really grateful for any help.
Kindest regards
Mark
-
First off, the 100-links thing isn't a law set in stone. SEOmoz's tools do yell about it if you go over 100 links on a page. This "100 links" lore comes from a Matt Cutts blog post:
http://www.mattcutts.com/blog/how-many-links-per-page/
If you look closely, you may notice that there are more than 100 links even on the page where Matt wrote about this. It's a fairly loose guideline in my eyes. From my own professional experience, if every page on your site has 500 links, you're going to hurt for it. But if you have 125 links on quite a few pages, or put out a blog post that's an insanely good resource linking to a few hundred people, you'll still be just fine.
What it seems like they're going for here is an indicator of quality and usability. People really do abuse internal linking, and Google needs to make sure those people hurt for it. Without having seen your site, I'd just say: think about what your users really need on every single page of the site (usually it's far fewer than 100 links). Often, a simpler navigation is the better one. Look at which pages people are actually reaching in Google Analytics, or set up a heatmap to follow their behavior. Tweak it. Annotate the change in Google Analytics. See if pageviews or other key goals improve.
As for only seeing 1k pages out of 6k in the index, I'd again take a close look at what is actually of value. If you have a lot of duplicate/thin content, you may be best off using noindex,follow tags on some of it to improve Google's perception of your domain's quality as a whole. If the pages are all of value, you could have other IA issues. One test site of mine has roughly 1,000,000 pages that aren't noindexed, and that's exactly how many appear in the index. If it's really good, useful stuff, you should definitely be able to get it indexed.
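If you do go the noindex,follow route, the mechanics are simple. Here's a minimal sketch (again assuming a Node/Express setup purely for illustration; isThinContent and loadProductPage are hypothetical helpers, and on your actual platform you'd add the equivalent header or meta tag to the relevant templates):

    // Minimal sketch of applying noindex,follow to thin/duplicate pages.
    // Two equivalent mechanisms: the X-Robots-Tag HTTP header, or a
    // <meta name="robots" content="noindex, follow"> tag in the <head>.
    import express from "express";

    const app = express();

    app.get("/products/:id", (req, res) => {
      const pageHtml = loadProductPage(req.params.id); // hypothetical loader

      if (isThinContent(req.params.id)) {
        // Keep the page out of the index, but let its links pass equity.
        res.setHeader("X-Robots-Tag", "noindex, follow");
      }
      res.send(pageHtml);
    });

    // Stand-in: in reality you'd check word count, duplication, etc.
    function isThinContent(id: string): boolean {
      return false;
    }

    function loadProductPage(id: string): string {
      return `<!DOCTYPE html><html><head><title>Product ${id}</title></head><body></body></html>`;
    }

    app.listen(3000);

The "follow" part matters: the page stays out of the index, but its links still get crawled, so the rest of the site doesn't lose out.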