Blocking Google from telemetry requests
Hi Rogier, Yes, these requests usually count towards your crawl budget, as Googlebot makes them on a per-request basis. It obviously depends on how the requests are set up, but otherwise I would advise going with the robots.txt exclusion you're already heading towards. Hope this helps!
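As a rough sketch of that robots.txt exclusion (the path below is a placeholder; use whichever endpoint your telemetry requests actually hit):

```
User-agent: *
Disallow: /telemetry/
```

This tells compliant crawlers, including Googlebot, not to request anything under that path, so those hits stop eating into crawl budget.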
Technical SEO Issues | Martijn_Scheijbeler
Hide messenger for crawlers
In general, I don't think this is a great idea.

Although Google does meter out crawl allowance, Google also wants a realistic view of the pages it is crawling. Your attempt at easing the burden on Google's crawl-bots may be seen as an attempt to 'fake' good page-speed metrics, for example (by letting Google load the web page much faster than end users do). This could cause some issues with your rankings if uncovered by a 'dumb' algorithm (which won't factor in your good intentions).

Your efforts may also be unnecessary. Although Google 'can' fire and crawl JavaScript-generated elements, it doesn't always do so, and it doesn't do it for everyone. If you read my (main) response to this question, you'll get a much better idea of what I'm talking about here. As such, the majority of the time you may be taking on 'potential' risk for no reward.

Would it be possible to code things slightly differently? Currently you state that this is your approach: "This means that we are actively adding javascript code which will load the Intercom javascript on each page, and render the button afterwards." Could you not add the button through HTML/CSS, and bind a smaller script to the button which then loads the Intercom javascript? I am assuming here that the Intercom javascript is the large script which is slowing the page(s) down. Why not load that script only on request (seems logical, but I also admit I am no dev - sorry)?
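To illustrate the idea above, here is a rough sketch of a small handler that only injects the heavy chat script when the user actually clicks the button. The button selector and script URL are placeholders, not Intercom's real embed code:

```javascript
// Sketch: defer loading a heavy chat-widget script until the user
// clicks the (statically rendered) chat button. Placeholder names
// throughout; adapt to your real markup and script URL.
function initChatButton(doc, scriptUrl) {
  let loaded = false;
  return function onClick() {
    if (loaded) return false; // only inject the script once
    loaded = true;
    const s = doc.createElement('script');
    s.src = scriptUrl;
    s.async = true; // don't block parsing when it does load
    doc.head.appendChild(s);
    return true; // script tag was injected on this click
  };
}

// In the browser, bind the handler to the static HTML/CSS button:
if (typeof document !== 'undefined') {
  const handler = initChatButton(document, 'https://widget.example.com/chat.js');
  document.querySelector('#chat-button')?.addEventListener('click', handler);
}
```

This keeps the initial page load light for users and crawlers alike, without hiding anything from Googlebot specifically.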
It just seems as though more things are being initiated and loaded up-front than are really required. Google wants to know which technologies are deployed on your page if it chooses to look, and it also doesn't want people going around faking higher page-speed loading scores.

If you really want to stop Google wasting time on that script, your basic options would be:

1. Code the site to refuse to serve the script to the "googlebot" user agent
2. Block the script in robots.txt so that it is never crawled (directive only)

The first option is a little thermonuclear and may mean you get accused of cloaking (unlikely), or at the least of 'faking' higher page-speed scores (more likely). The second option is only a directive, which Google can disregard, so the risks are lower. The downside is that Google will pick up on the blocked resource and may not elevate your page-loading speed. Even if they do, they may say: "Since we can't view this script or know what it does, we don't know what the implication for end users is, so we'll dampen the rankings a little as a risk-assessment factor."

Myself, I would look for an implementation that doesn't slow the site down so much (for users or search-bots). I get that it may be tricky; obviously re-coding the JS from Intercom would probably break the chat entirely. Maybe, though, you could think about when that script has to be loaded. Is it really needed on page-load, all the time, for everyone? Or do people only need that functionality when they choose to interact? How can you slot the loading of the code into that narrow trench and get the best of both worlds?

Sorry it's not a super simple answer; hope it helps.
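For completeness, the first option above boils down to a user-agent check on the server. A minimal sketch, shown as a plain predicate you could wire into whatever server or middleware you run (and with the caveat, as said above, that this approach risks being read as cloaking):

```javascript
// Sketch of option 1: decide whether to serve the script based on
// the request's User-Agent header. Googlebot identifies itself with
// "Googlebot" in its UA string. Illustration only; not recommended.
function shouldServeScript(userAgent) {
  return !/googlebot/i.test(userAgent || '');
}
```

Your server would return the script only when `shouldServeScript(req.headers['user-agent'])` is true, and a 403 or empty response otherwise.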
Technical SEO Issues | effectdigital