Why is my site not getting crawled by Google?
-
Hi Moz Community,
I have an escort directory website that is built with AJAX. We basically followed all the recommendations, like implementing the escaped fragment code, so Google would be able to see the content. The problem is that whenever I submit my sitemap in Google Webmaster Tools, it always shows 700 URLs submitted but only 12 static pages indexed. I also did a site: query, and only a handful of pages were indexed.
Does it have anything to do with my site being on HTTPS and not on HTTP?
My site is on HTTPS and all my content is AJAX-based.
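For reference, here is roughly what we implemented (a simplified sketch; example.com is a placeholder for our domain). The page opts in to Google's AJAX crawling scheme with the fragment meta tag, which tells Googlebot to request the `?_escaped_fragment_=` version of the URL, and our server is supposed to return a pre-rendered HTML snapshot for that request:

```html
<!-- On each AJAX page (no #! in the URL), opt in to the AJAX crawling scheme -->
<meta name="fragment" content="!">

<!-- Googlebot should then fetch a snapshot URL like:
     https://example.com/some-page?_escaped_fragment_=
     and our server returns the fully rendered HTML for that page -->
```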
Thanks
-
Why build it with AJAX? AJAX is good for functionality that needs to load seamlessly, but not good for content.
The escaped fragment approach does not seem to work well; many people are having problems getting indexed with it. Does your content have to be loaded via AJAX? Why not render it on the page? That would be much simpler.
-
Hi There
As Alan mentioned, plain HTML is going to be a much more reliable way to get indexed. HTTPS alone shouldn't be affecting anything. But do you have a different robots.txt for HTTPS and HTTP? Is the HTTPS one blocking crawlers? Do you have the HTTPS version of the site registered in Webmaster Tools? When you go to Crawl Stats, how many pages does it show being crawled?
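To illustrate the robots.txt issue: a common misconfiguration is for the HTTPS site to serve a blanket disallow (sometimes left over from a staging setup). If your https://yoursite/robots.txt looks anything like this, Google won't crawl the secure site at all:

```
User-agent: *
Disallow: /
```

Compare the robots.txt served at the HTTP and HTTPS versions of your domain; they can differ, and Googlebot obeys whichever one it fetches for the protocol it is crawling.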
-Dan
-
Sean, what is the URL for your site?