What's going on with Google's index - JavaScript and Googlebot
-
Hi all,
Weird issue with one of my websites.
The website URL: http://www.athletictrainers.myindustrytracker.com/
Let's take 2 different article pages from this website:
1st: http://www.athletictrainers.myindustrytracker.com/en/article/71232/
As you can see the page is indexed correctly on google:
http://webcache.googleusercontent.com/search?q=cache:dfbzhHkl5K4J:www.athletictrainers.myindustrytracker.com/en/article/71232/10-minute-core-and-cardio&hl=en&strip=1 (that's the "text only" version, indexed on May 19th)
2nd: http://www.athletictrainers.myindustrytracker.com/en/article/69811
As you can see the page isn't indexed correctly on google:
http://webcache.googleusercontent.com/search?q=cache:KeU6-oViFkgJ:www.athletictrainers.myindustrytracker.com/en/article/69811&hl=en&strip=1 (that's the "text only" version, indexed on May 21st)
They both have the same code. As for the dates, there are pages that were indexed before the 19th that are also problematic. Google sometimes can't read the content; it seems to read it only when it wants to.
Can you think of what the problem could be? I know that Google can read JS and crawl our pages correctly, but this happens only with a few pages, not all of them (as you can see above).
-
Hi there
I would take a look at the Fetch as Google tool in your Search Console and see what issues arise there. I would do this for both desktop and mobile, so you can see how these pages are being rendered by Google.
If you get a "Partial" status, Google will return the issues it has run into, and you can prioritize those issues and decide how to handle them from there.
You can read more about Javascript and Google here as well as here.
Hope this all helps! Good luck!
-
Hi Patrick,
We already tested all the pages with the Fetch as Google tool. Sorry that I didn't mention it before, but everything over there is OK. I do see the "Partial" status, but the issues are with one of the social plugins and have no connection to the content.
So, all the tools show that it should be OK, but Google isn't indexing the pages correctly.
I already checked:
1. Frontend code.
2. No-index issues.
3. Canonical issues.
4. Robots.txt issues.
5. Fetch as Google issues.
I know that Google can read JS, and I don't understand why it can read only some of the pages and not all of them (there isn't any difference between them).
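For anyone wanting to repeat the checks in the list above, here is a minimal sketch of how to inspect what a crawler sees in the raw HTML, before any JavaScript runs. The sample HTML, headers, and phrase below are hypothetical placeholders; in practice you would feed in the fetched source of one of the article pages and a phrase from its visible body text.

```python
# Sketch: verify noindex/canonical signals and whether body text exists
# in the raw (pre-JavaScript) HTML, using only the standard library.
from html.parser import HTMLParser

class IndexabilityParser(HTMLParser):
    """Collects robots meta tags and the canonical link from raw HTML."""
    def __init__(self):
        super().__init__()
        self.robots_meta = []
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots_meta.append((a.get("content") or "").lower())
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def check_page(html, headers, phrase):
    parser = IndexabilityParser()
    parser.feed(html)
    return {
        "noindex_header": "noindex" in headers.get("X-Robots-Tag", "").lower(),
        "noindex_meta": any("noindex" in c for c in parser.robots_meta),
        "canonical": parser.canonical,
        # If this is False, the text only exists after JS rendering --
        # exactly the risk being discussed in this thread.
        "phrase_in_raw_html": phrase.lower() in html.lower(),
    }

# Hypothetical example: a page whose body text is injected by JavaScript,
# so the raw HTML contains only an empty app container.
html = ('<html><head><link rel="canonical" href="/en/article/71232/">'
        '</head><body><div id="app"></div></body></html>')
result = check_page(html, {}, "10-minute core and cardio")
print(result)
```

If `phrase_in_raw_html` comes back False for a page, the visible content depends entirely on JavaScript execution, which would explain the inconsistent caching.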
-
Hello Or,
I just checked the most recent cache and it looks like Google does NOT see the content on the first URL (ending in /71232/) but does see it on the second one (ending in 69811).
This is the opposite of the situation you described above.
Yes, Google "can" execute Javascript, but just because they can doesn't mean they will every time. Also, perhaps not all of their bots can or do execute Javascript every time. For instance, the bot they use for pure discovery may not, while the one they use to render previews may.
Or they may give the JavaScript only so long to execute before giving up.
I also notice the page that is currently not indexed fully has an embedded YouTube video. While this wouldn't typically cause any problems with getting other content indexed, in your case it may be worth looking into. For example, it could contribute to the script-execution-time issue mentioned above.
When it comes to executing scripts, submitting forms, etc... Google is very much at the stage of just randomly "trying stuff out" to "see what happens". It's like a hyperactive baby in a spaceship just pushing buttons like crazy, which is why we run into issues with "spider traps" and with unintentionally getting dynamic pages indexed from form submissions, internal searches and other oddities in site architecture. It is also one of the reasons why markup like Schema.org and JSON-LD are important: They allow us to label the buttons so the bot "understands" what it is pressing (or not).
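As a concrete illustration of "labeling the buttons," a minimal JSON-LD block for an article page might look like the following. The values here are hypothetical placeholders loosely based on the first URL in this thread, not markup taken from the actual site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "10-Minute Core and Cardio",
  "mainEntityOfPage": "http://www.athletictrainers.myindustrytracker.com/en/article/71232/",
  "datePublished": "2015-05-19"
}
</script>
```

Because this sits in the static HTML rather than being built by scripts, the crawler can read it without executing anything.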
I apologize that there is no definitive answer for your problem at the moment, but given that the behavior has switched completely, I'm not sure how to go about investigating further. This is why it is still very much a best practice to ensure all of your content is indexable by not rendering it with JavaScript. If you can't see the textual content in the source code (as is the case here), then you are at risk of it not being seen by Google.