Questions
If I disallow an unfriendly URL via robots.txt, will its friendly counterpart still be indexed?
Yeah, solving this via .htaccess would be great, especially if you have link equity flowing into any of those URLs. I'd go one step further than Irving and highly recommend canonical tags on those URLs as well. Since, as you said, it's all one page with infinite URL possibilities, the canonical should be easy to implement — a sketch of both is below. Best of luck!
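As a rough sketch (the file name, parameter, and paths here are hypothetical placeholders, not your actual URLs), the .htaccess side is a mod_rewrite 301 from the unfriendly query-string version to its friendly counterpart:

```apache
# Hypothetical sketch: 301-redirect an unfriendly query-string URL
# (e.g. /page.php?id=123) to its friendly counterpart (/widgets/123/).
# "page.php" and "id" are placeholders -- adjust to your own URL scheme.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^page\.php$ /widgets/%1/? [R=301,L]
```

The trailing `?` in the substitution drops the original query string so the friendly URL stays clean. On the friendly page itself, the canonical would look like `<link rel="canonical" href="https://www.example.com/widgets/123/" />` (example.com being a placeholder), which tells search engines which URL to index even if an unfriendly variant gets crawled.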
Intermediate & Advanced SEO | Cyrus-Shepard
Ever Wise to Intentionally Use JavaScript for Global Navigation?
I had this same conversation with someone yesterday about a very similar set-up. In a 2009 blog post, Matt Cutts said that the main reason not to include over 100 links on a page is user experience. It used to be a technical limitation, but that no longer applies. Here is the post: http://www.mattcutts.com/blog/how-many-links-per-page/

Lots of links can mean lots of code, which slows the page down, and a large link count also divides the PageRank flowing out of the page fairly heavily (see the formula below). That said, in the age of mega-menus I don't think the number of links is, in itself, a problem.

Just for reference (the answer for your situation may be different), our conversation ended with the decision to reduce the number slightly, restructuring to leak less PageRank to unimportant pages. Overall we still have a LOT of links and are happy with that.
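To make the dilution point concrete, here is the classic simplified PageRank formula (the textbook version, not anything specific to the set-up discussed above): each linking page T_i passes PR(T_i)/C(T_i) to every page it links to, so doubling a page's outlink count roughly halves what each individual link receives.

```latex
% Simplified PageRank of page A, where d is the damping factor
% (commonly 0.85), T_1..T_n are the pages linking to A, and C(T_i)
% is the number of outlinks on T_i. More outlinks on T_i means a
% smaller PR(T_i)/C(T_i) share passed through each link.
PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```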
Intermediate & Advanced SEO | matbennett