Google Webmaster Tools Missing Meta Titles on Ajax Pages... Weird!!
-
Hey Mozers,
I was looking through my Google Webmaster Tools account under HTML Improvements. It looks like I have 2,200 pages missing meta titles, and I was about to lose it thinking HOW COULD THIS HAPPEN! Then I realized the pages were "Ajax pages", specifically a check-price pop-up that I don't want crawled by Google. So to Google it looks like I have over 2k pages missing meta titles, and they are all check-price pop-ups. How would you suggest I block this? I thought about going the easy route and blocking the subfolder in our robots.txt file, but I'm scared of that because we use Ajax for a bunch of calls. I'm also wary of putting `<meta name="robots" content="noindex,nofollow">` in the head because it requires hard coding.
I know I'm not the first to come across this issue. Any ideas?
-
Hey Rodrigo, some additional questions to try and help here.
- Are these the only files getting loaded within their subfolder?
- What do the price files look like? A full HTML file from `<html>` to `</html>`? Or just a section of code, like a div?
- Why are the prices stored in static files? Are the prices in a database? Could this popup be accomplished with a single file template that loads a price from the database? If so, then the hard-coding problem becomes much simpler.
Feel free to share URLs if you are able, or DM them.
Also, if hard-coded, you'd want to use `<meta name="robots" content="noindex">`. There's no reason to use nofollow in the meta robots tag in the majority of cases.
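For reference, the hard-coded version is a single line in the `<head>` of the pop-up template (a sketch; the surrounding markup is illustrative):

```html
<head>
  <!-- Keep this pop-up fragment out of Google's index -->
  <meta name="robots" content="noindex">
</head>
```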
-
Hey Kane,
Thank you for answering. According to Webmaster Tools there are over 2,200 of these check-price pop-ups. They happen when a customer clicks "Add to cart", which activates the Ajax pop-up asking the customer to enter a ZIP code to check the price.
The source code of this pop-up is very small; I almost feel like there is no `<title>` tag in it at all.
**Example URL:**
I was thinking of putting "Disallow: /checkprice" under our robots.txt
-
Hi Rodrigo,
OK - so I was able to find the page functionality in question on your actual site just to double check what was going on.
Since this content isn't important to the page from an SEO standpoint, it makes sense to just remove these pages from the index.
To do that, your best bet is probably robots.txt. Here's a good Stack Exchange thread with John Mueller confirming that.
I believe any one of the following entries will do the trick, but I highly recommend you double-check my work and test this before you implement it. This is also a good time to add a liability disclaimer in case you accidentally noindex your whole site.
- Disallow: /AjaxPages/PopUp/CheckPrice
- Disallow: /AjaxPages/PopUp/CheckPrice.aspx
- Disallow: /CheckPrice
- (choose one of these, you don't need to use all of them)
Pretty sure robots.txt path matching is case-sensitive, so /checkprice may not work.
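Since matching is case-sensitive, it's worth verifying the rule before you deploy it. A minimal sketch using Python's standard-library `urllib.robotparser` (example.com is a placeholder domain, and the Disallow path assumes the URL structure above):

```python
from urllib.robotparser import RobotFileParser

# Parse the proposed robots.txt rules directly, without fetching anything
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /AjaxPages/PopUp/CheckPrice",
])

# The pop-up URL is blocked (the rule is a prefix match)
print(rp.can_fetch("*", "https://example.com/AjaxPages/PopUp/CheckPrice.aspx"))  # False

# A lowercase variant is NOT blocked, since matching is case-sensitive
print(rp.can_fetch("*", "https://example.com/checkprice"))  # True

# An unrelated page is unaffected
print(rp.can_fetch("*", "https://example.com/Products/widget-123"))  # True
```

Note the `User-agent: *` line; a `Disallow` directive only applies inside a user-agent group, so don't add it on its own.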
There is also something called an X-Robots-Tag that could do this instead of your robots.txt file. Instructions on that at https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?hl=en
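Since the URLs end in .aspx, the site is presumably running on IIS, so a web.config fragment could send that header for just the one path. This is a sketch only; the path and syntax assume IIS 7+ and should be checked against your own config:

```xml
<!-- web.config: send "X-Robots-Tag: noindex" for the CheckPrice page only -->
<location path="AjaxPages/PopUp/CheckPrice.aspx">
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="X-Robots-Tag" value="noindex" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</location>
```

One caveat: if you also block the URL in robots.txt, Google can't crawl it to see this header, so pick one approach or the other.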
Finally, consider testing this on your staging server before rolling it out live.