AJAX & jQuery Tabs: Indexation & Navigation
-
Hi
I have two questions about indexing tabs.
1. Let's say I have tabs, or an accordion, triggered with jQuery. That means all the HTML is accessible and can be indexed by search engines. But suppose a search query is relevant to the content in Tab #3, while Tab #1 is the one open by default. Is there any way Tab #3 could be opened directly if it's more relevant to the search query?
2. AJAX tabs: We have pages whose tabs are loaded via AJAX (example: http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html). I'm wondering about the current best practice: Google recommends HTML snapshots, while a newer SEOmoz article talks about pushState(). What's the way to go here?
Or in other words: how do we get tab & accordion content indexed, and allow users to navigate directly to it?
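For illustration, here is a sketch of hash-based deep linking for jQuery UI tabs (the panel ids, selectors, and markup are made up for the example, not Swisscom's actual code). The idea is that each tab state gets its own URL fragment, so a link or bookmark can open Tab #3 directly:

```javascript
// Sketch: hash-based deep links for jQuery UI tabs.
// Each tab panel is assumed to have an id like "tab-simlock"; a URL such as
// /page#tab-simlock should then open that tab directly on page load.

// Pure helper: map a location.hash to a tab index, defaulting to the first tab.
function tabIndexFromHash(hash, tabIds) {
  var id = (hash || '').replace(/^#/, '');
  var i = tabIds.indexOf(id);
  return i >= 0 ? i : 0;
}

/* Browser wiring (assumes jQuery UI tabs markup with panel ids):
$(function () {
  var ids = $('#tabs > div').map(function () { return this.id; }).get();
  $('#tabs').tabs({ active: tabIndexFromHash(location.hash, ids) });
  // Update the hash when the user switches tabs, so each state is linkable.
  $('#tabs').on('tabsactivate', function (event, ui) {
    location.hash = ui.newPanel.attr('id');
  });
});
*/
```

A pushState()-based variant (as in the SEOmoz article mentioned below) would swap the `location.hash =` line for `history.pushState()` to get clean URLs instead of fragments.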
-
I would do some serious debugging.
If you want to see what Google sees, Screaming Frog is a great tool: http://www.screamingfrog.co.uk/seo-spider/
See what Screaming Frog tells you, then try to fix what it reports (for example, make sure the pages are served as UTF-8).
You can fix some of this by changing your robots.txt; see these two pages:
https://developers.google.com/webmasters/ajax-crawling/docs/learn-more
https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
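For reference, the first of those docs describes Google's hashbang scheme: a URL containing #! is fetched by the crawler with the fragment moved into an _escaped_fragment_ query parameter, and the server answers with an HTML snapshot. A simplified sketch of that URL mapping (my own illustration, not Google's code; the real scheme escapes only a specific character set):

```javascript
// Simplified sketch of the AJAX-crawling URL mapping:
// /help#!tab=simlock  ->  /help?_escaped_fragment_=tab%3Dsimlock
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;                  // no hashbang: nothing to rewrite
  var base = url.slice(0, i);
  var fragment = encodeURIComponent(url.slice(i + 2));
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

The server would then recognize the _escaped_fragment_ parameter and return a pre-rendered HTML snapshot of that tab state.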
Your URL also has a lot of parts for my taste: http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html
In my opinion there is still some serious code work to do.
If you want to use one of the tools Google recommends, you'll need a crawler that can execute JavaScript and walk the AJAX states; Crawljax can do that (first link below), together with Selenium (second link):
https://github.com/crawljax/crawljax/blob/master/CHANGELOG.md
http://code.google.com/p/selenium/issues/detail?id=387
http://www.unicode.org/faq/utf_bom.html
Because Crawljax is Java-based, there are some great related tools to be found there as well.
I wish you the best and hope that this is helpful,
Thomas
-
I forgot to add this, but it's very relevant: that software can help make your scripts SEO-friendly.
-
Hi Thomas
Thanks for the resources. I'll have to check with IT which solution seems most practical.
Though I don't understand two points:
- Where does robots.txt come into play here?
- How do we have too many parameters in http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html? Can you be more specific?
Thanks!
/Philipp
-
Hello Philipp,
The robots.txt file lets you tell the bots that crawl your site which URLs you do and do not want exposed to the world and to Google.
Most search engines will analyze and follow a link only if it contains three query-string parameters or fewer. In the link shown, the five elements after the first / are, strictly speaking, path segments rather than query-string parameters, but a similar principle applies: the deeper and longer the URL, the harder it is to crawl. You can block off certain parameters or paths with robots.txt.
/en/residential/help/loesung/entfernen-sie-sim-lock
http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.html
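As a sketch, such blocking in robots.txt could look like this (the sessionid parameter name is made up for the example; Googlebot supports the * wildcard here):

```
# Hypothetical example: keep crawlers away from URLs carrying a session parameter
User-agent: *
Disallow: /*?*sessionid=
```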
For some reason, whenever I go to the link you posted, I get this error:
The requested URL /system/sling/cqform/defaultlogin.html was not found on this server.
http://msdn.microsoft.com/en-us/library/ff723936(v=expression.40).aspx
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
Please see the links above regarding URL parameters; Microsoft and Google agree that with too many parameters they will not crawl the link.
I hope this has been helpful.
sincerely,
Thomas