The Moz Q&A Forum

    Pages are Indexed but not Cached by Google. Why?

    Intermediate & Advanced SEO
    • friendoffood @Travis_Bailey

      Travis,

      First of all, I absolutely appreciate all the time you are taking to address my issues here. Second of all, it IS very tempting to join you and any others here and go build houses or do something else, especially given the last few days. :)

      Ok. I'll try to keep it short:

      I wasn't suggesting you had any bearing on my site going down, just that maybe there was a 'Moz effect'. Hope not.

      Re: Chrome Incognito Settings:
      I'm really worried now that there is a sessions problem, since anyone with cookies enabled should have the session ID preserved between pages. In that case they would have only one entry in my 'user' table, yet you had 3 in a short amount of time; that's why it thought you were a robot. I don't know how to duplicate the problem, though, because I've never hit it personally, and I use a program that connects to other machines with all kinds of combinations of operating systems, browsers, and computers and have never seen it there either. It's my problem, and I'll have to figure it out somehow. I have many session variables, and it would be a huge overhaul to stop using sessions at this point. If you have any ideas (I'm using PHP), I'm all ears.
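      Ted doesn't show his PHP here, so this is only a minimal Python sketch of the bookkeeping he describes; the names `log_hit` and `user_table` are made up for illustration. It shows why a client that never returns its session cookie produces several 'user' rows in quick succession, the exact pattern his bot check flags:

      ```python
      import uuid

      def log_hit(user_table, session_id=None):
          """Record a page hit. A client that returns its session cookie keeps
          one row in the 'user' table; a client that drops cookies gets a new
          row on every request."""
          if session_id is None or session_id not in user_table:
              session_id = uuid.uuid4().hex  # new visitor row
              user_table[session_id] = 0
          user_table[session_id] += 1
          return session_id

      # A browser keeping cookies: the session ID round-trips, one row total.
      table = {}
      sid = None
      for _ in range(3):
          sid = log_hit(table, sid)
      assert len(table) == 1 and table[sid] == 3

      # A client that never returns the cookie: three rows in short order,
      # which is the pattern that tripped the robot check.
      table = {}
      for _ in range(3):
          log_hit(table, None)
      assert len(table) == 3
      ```

      The same two behaviors are what distinguish most real browsers from crawlers that discard cookies between fetches.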

      Re: Fun w/ Screaming Frog:
      The IP for the instance 8.5 hours later was the same as your first one. Yet if you were spoofing, it shouldn't have said Screaming Frog in the user agent, right? It showed up in my 'bot-stopped' file as stopped instantly because it was an unexpected bot. So I'm confused, unless perhaps you tried it separately from the spoofed run?

      <<Normally Screaming Frog would display notifications, but in this instance the connection just timed out for requested URLs. It didn't appear to be a connectivity issue on my end, so... yeah...>>
      Ok.

      Fun w/ Scraping and/or Spoofing:
      I'll have to check into it. I've run YSlow and GTmetrix without problems. I see you tried to run it on the Ferguson page and the home page. I just ran the Ferguson page in GTmetrix - which uses both PageSpeed (Google?) and YSlow - and it ran OK, although not a great grade.

      <<While I'm running off in an almost totally unrelated direction, I thought this was interesting. Apparently Bingbot can be cheeky at times.>>  That is interesting.

      I'm now most worried about the session issue, as it may be affecting a lot of my users: I've been assuming multiple entries were from robots, which generally don't keep sessions between page crawls (actually quite a few of the SEO crawlers do, but Google, Bing, and Yahoo don't). If you're OK with going to my home page without incognito, clicking on a few pages, and letting me know the first part of your IP when you do, it might really help me. You shouldn't be blocked anymore (it lasts 1 day). But no worries if you're ready to move on.

      Sorry, that wasn't so short after all. Thanks again. Ted

      • RyanPurkey

        You're welcome, Teddy. Something that goes undermentioned when SEOs run very precise tests on specific on-page changes is that they typically do them on completely brand-new domains with nonsense words and phrases, because of the chance that their manipulations might get the site blacklisted. There's no loss to them if that happens, other than unanswered questions, and if the site does survive for a bit, maybe they'll learn a few new insights. This level of granular on-site testing isn't a practical method for legitimate, public-facing sites.

        When it comes to sites that serve a business function, aside from testing possible granular ranking changes, you're going to be much better served by measuring your changes against user interaction instead of search engine rankings. In that vein, design and test for the users on your site, not the search engines. If your site is getting visits but none of your business goals are being met, go nuts on testing those factors. Split test, iterate, and improve things with a focus on better conversions. Dive deep into services like Optimizely and research by the likes of Conversion Rate Experts. Use focus groups and usability testing to see how the minutiae of your changes affect interaction. You can go as extreme as you want in that regard.

        Most importantly, the bulk of search engine ranking strength comes from external factors: the number and variety of sites linking to your site, the quality of those sites, their trust and reputation, their semantic agreement with yours, etc. These factors will have many times more influence on your ranking than your on-site tweaks in most cases. If your site is functional and complies with Google's own guidelines (http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf), you've covered the bulk of what you need to do on site. Focus instead on off-site factors.

        The site: search function exists mostly to give searchers the ability to find a piece of information on a given domain. For example, if a reporter wants to cite an NBA stat from NBA.com, they'd use "stat thing" site:nba.com as a search. For users, that's useful for searching specifics, and for Google it makes them look all the better at "categorizing the world's information." Cache demonstrates the amount of information Google has archived and how quickly it's available. Back in the day--story time--Google used to advertise how quickly and how broadly they indexed things. In fact, they still do! If you look at a search result you'll notice a light gray statistic at the very top that says something like "About 125,000,000 results (0.50 seconds)" for a search about hot dogs, for example. This is Google saying, "We're BIG and FAST." The precise details of your site rank way below Google's own story of being big and fast.

        If you focus your efforts on off-site optimization, linking with other reputable sites, and building your network, you'll be far better served, because you'll be getting referral traffic as well as a lift in Google. Cheers!

        • Travis_Bailey @friendoffood

          I'll PM my public IP through Moz. I don't really have any issue with that. Oddly enough, I'm still blocked though.

          I thought an okay, though slightly annoying, middle ground would be to give me a chance to prove that I'm not a bot. It seems cases like mine may be few and far between, but it happened.

          It turns out that our lovely friends at The Googles just released a new version of reCAPTCHA. It's a one-click-prove-you're-not-a-bot-buddy-okay-i-will-friend-who-you-calling-friend-buddy bot check. (One click - and a user can prove they aren't a bot - without super annoying squiggle interpretation and entry.)

          I don't speak fluent developer, but there are PHP code snippets hosted on this GitHub repo. From the documentation, it looks like you can fire the widget when you need to. So if it works like I think it could, you'd have a little breathing room to figure out the possible session problem.
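          Travis links the snippets rather than quoting them, so here is only an illustrative Python sketch of the server-side half of reCAPTCHA verification. The `siteverify` endpoint and the `secret`/`response`/`remoteip` field names come from Google's reCAPTCHA documentation; the function names and structure are invented:

          ```python
          import json
          import urllib.parse
          import urllib.request

          VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

          def parse_verify_response(raw_json):
              """Google's siteverify endpoint answers with JSON such as
              {"success": true} or {"success": false, "error-codes": [...]}."""
              return json.loads(raw_json).get("success", False)

          def verify_recaptcha(secret, response_token, remote_ip=None):
              """POST the widget's token back to Google; True means human."""
              fields = {"secret": secret, "response": response_token}
              if remote_ip:
                  fields["remoteip"] = remote_ip
              body = urllib.parse.urlencode(fields).encode()
              with urllib.request.urlopen(VERIFY_URL, body) as resp:
                  return parse_verify_response(resp.read().decode())

          # The response-handling step, shown with canned JSON:
          assert parse_verify_response('{"success": true}') is True
          assert parse_verify_response('{"success": false}') is False
          ```

          The PHP snippets Travis mentions do the same POST; only the verification result needs to reach the bot-blocking logic.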

          I've also rethought the whole carpenter/mason career path. After many searches on the Yahoos, I think they may require me to go outside. That just isn't going to work.

          • friendoffood @Travis_Bailey

            Sorry about your being blocked. A day hadn't passed, due to the timing of your second visit--sorry. I just changed the IP address in the table, so you aren't blocked now.

            OK, I think I figured out what happened. You first went to the Ferguson page. You may have clicked on something, but the same page was reloaded. Then, in a different tab, you clicked through to my home page from a Google search results page. Then, in a third tab, you went directly to my home page. Then you ran Screaming Frog, and the program stopped it without a message after seeing the word 'spider' in the user agent. Then you tried again, and it recognized that as a stopped bot and gave the message about suspicious activity.

            The program wipes out sessions and cookies when a user goes to the home page (it isn't even linked anywhere), since that is just a location-choosing page; so when you opened it in a different tab, the sessions were wiped out. It had nothing to do with your being in incognito or not having cookies allowed.

            Does this sound like what you may have done, and in sequence?

            That's what it looks like, and, if correct, that is a huge relief for me, since it is not usual user activity. (Although I may have to reconsider whether it's still a poor approach.)

            I don't know what happened with your second visit and the timeout. Curious that you got some 60 pages crawled or so--I don't suppose you have anything that would tell me the first 3 of those, so I can look into why it timed out? The table isn't keeping the IP for crawls, so I can only look them up by the URL crawled and the time.

            • friendoffood @RyanPurkey

              What a great answer, Ryan! Thanks. I'll tell you what my concern is. As a coupon site, I know users don't want a bunch of wording at the beginning of pages; they just want to find the coupon and get it. But from what I've read, Google would probably reward the site more if there were beefier wording in it instead of a bunch of listings that are closer, in some ways, to a simple link page. I also have a 'mega-menu' on some of my pages, which I think is pretty user-friendly, but I have read that Google might not know for sure whether it is part of the page content, and some forums I found talk about how their rankings improved when they simplified their menus. Lastly, I have a listing of location links at the top of the page so users can 'drill down' closer to their neighborhood. This is just about the first thing Google sees and may again confuse Google as to what the page is all about.

              So IF the lack of 'wording content' and the early placement of menu-type content are making my site hard to figure out from Google's perspective, I have alternatives and thought I could test them against Google ranking. For example, I can add wording content early on to 'beef up' the page so it isn't just a bunch of coupon offer links. I could also ajax the stuff above the 'coupon content' so Google doesn't read it and get confused, and then put the actual links for Google to read down at the bottom of the page. Both of those would be moves solely to satisfy Google, with no effect on the user. Google isn't perfect, and I don't want to be penalized in ranking as a result of not addressing Google's 'imperfections', since it seems every edge counts and being on page 2 or 3 just isn't quite good enough. I view this as reasonable testing rather than devious manipulation, but of course what matters with Google ranking is what Google thinks.

              So in these cases the user response will be neutral: they generally won't care whether I have wording about what is on the page (especially if most of it requires clicking 'more info') or am ajaxing the menu information--again, they just want to find coupons. But if Google cares, as I have read they do, it would be nice to be able to verify that with some simple tests. It may be that my issues are somewhat unique as far as the typical webpage is concerned.

              Having said all of that I do think your advice makes a ton of sense as the user is really what it is all about ultimately.

              Thanks very much, and I'm giving you a 'good' answer as soon as I hear back!

              • max.favilli @RyanPurkey

                Ryan, I don't agree. It's true that external factors (in other words, backlinks) have the biggest impact nowadays, but on-page optimization, as far as my little experience tells, still affects ranking and is worth working on.

                And if we don't keep track of changes on pages and changes in ranking, how can we know what is working and what is not?

                Especially since there is no golden rule, and what works for one site doesn't necessarily work for another.

                To give an example, I had a page ranking in position 1 for a search query with a volume of 50k+ and very high competition. I expanded the content to improve ranking for some additional queries, and it worked: the page climbed from the 2nd and 3rd SERP pages to the 1st for a couple of those queries (I use Moz rank tracking, SEMrush, and ProRankTracker to monitor rankings).

                Unfortunately, ranking for the search query with the highest volume moved from position 1 to position 2. I changed the content a little to add some keywords, which made sense because it re-balanced the keyword density now that the content was bigger. Within 24 hours it got back to position 1, without damaging the improvement on the other queries.

                In many other cases I improved ranking on pages without any backlinks, just by improving the content, and I am talking about business-critical pages with high competition.

                So I would say on-page optimization is still worth spending time on; testing the effect of changes is a must, and monitoring Google ranking fluctuations is a must too.

                Of course I am not saying off-page optimization is not important. It is fundamental; I am taking that for granted.

                • max.favilli @friendoffood

                  Let me say it straight: all that bot blocking is not a good idea.

                  I have been there a few times in the past. Especially in e-commerce, scraping to compare prices is very common, and I tried blocking scrapers many times. Maybe I am not that good, but in the end I gave up, because the only thing I managed to do was annoy legitimate users and legitimate bots.

                  I scrape other websites too, for price comparison, tens of them. Since I don't want to be blocked, I split the requests among different tasks, add a random delay between each request, fake header data like the user agent (pretending to be Firefox on a Windows PC), and cycle through different proxies to continuously change IP address.

                  So as you can see, it's much harder to block scrapers than it seems.
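                  To illustrate why user-agent filters alone don't catch this, the rotation described above can be sketched in Python; the user agents and proxy hosts below are placeholders, not real infrastructure:

                  ```python
                  import itertools
                  import random

                  # Illustrative identities; the hosts are placeholders, not real proxies.
                  USER_AGENTS = [
                      "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0",
                      "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
                  ]
                  PROXIES = itertools.cycle(["proxy-a.example:8080", "proxy-b.example:8080"])

                  def request_plan(urls, min_delay=2.0, max_delay=8.0):
                      """Yield (url, headers, proxy, delay) for each request: a random
                      browser user agent, the next proxy in rotation, a random pause."""
                      for url in urls:
                          yield (
                              url,
                              {"User-Agent": random.choice(USER_AGENTS)},
                              next(PROXIES),
                              random.uniform(min_delay, max_delay),
                          )

                  plan = list(request_plan(["http://example.com/a", "http://example.com/b"]))
                  assert plan[0][2] != plan[1][2]                 # IP changes per request
                  assert all(2.0 <= step[3] <= 8.0 for step in plan)
                  ```

                  To a server-side blocklist, each request in such a plan looks like a fresh Firefox visitor from a different IP, which is exactly why user-agent and per-IP checks fall short.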

                  Nor would I use JS to block cut-and-paste. I have no data to base my judgement on, but it's annoying for users, it doesn't sound compliant with accessibility, it stinks, and Google usually doesn't like things that stink. Plus, if someone wants to scrape your content, you are not going to block them that way.

                  • RyanPurkey @friendoffood

                    Thanks. There are two main parts to Google "figuring out" your site. One: indexation. That's been solved; we know you're getting indexed. Two: ranking. Your site being so new and young, it's going to need backlinks and network growth to see dramatic ranking changes. If your menu isn't causing your site to be indexed poorly and your pages are being counted as unique enough, then you're OK there as well. The next most important step is getting your domain trust and authority up.

                    • RyanPurkey @max.favilli

                      Hi Massimiliano. I would disagree with myself if I were talking about your site too... ;^) But in this specific case, qjamba.com is a site that needs the fundamental quality of backlinks far more than it needs Teddy to write a bot that constantly pings Google to try to decipher the incremental on-site changes he's making. I'm speaking to his need to prioritize that aspect of optimization. Copying on a normal, public-facing website what an SEO does when creating a nonsensical site with gibberish words to test on-page optimization as purely as possible is a bad idea.

                      Obviously on-page optimization is important, but again, in this specific example, Teddy isn't even discussing his keyword rankings; rather, he was looking to go down an on-site optimization path that might make him more and more frustrated instead of bringing about much more positive results. Cheers!

                      • friendoffood @RyanPurkey

                        Some of my pages are on Google's page 2 or 3, and a few on page 1, for certain search terms that don't have a lot of competition but that I know SOME people are using (they're in my logs), and those pages have virtually no backlinks. I want to boost the ones on page 2 or 3 to page 1 as quickly as possible, because page 1 is 10x or more better than page 2. Time/cost is an issue here: I can make changes overnight at no cost, as opposed to blogging or paying someone to blog.

                        Because domain authority and usage take so long, it seems worth tweaking/testing NOW to try to boost certain pages from page 2 or 3 to page 1 virtually overnight, as opposed to waiting months on end for usage to kick in. I don't know why Google would penalize me for moving a menu or adding content--basically for performing on-page SEO--so it would be nice to figure out which tools (cached pages, site:www..., GWT, GA, or otherwise) to look at to know whether Google has re-indexed the new changes.

                        Of course, the biggest pages with the most common search terms probably HAVE to have plenty of backlinks and usage to get there, and I know that in the long run that's the way to success when there is high competition, but it seems to me that on-page SEO is potentially very valuable when the competition is slimmer.

                        • friendoffood @RyanPurkey

                          I think there's been a misunderstanding. I'm not writing a bot. I am talking about making programming changes and then submitting them to Google via the fetch tool, to see how they affect my ranking as quickly as possible, instead of waiting for the next time Google crawls that page--which could be weeks. My earlier reply may have given you a different impression. I want to speed up the indexing by fetching the pages in Google and then looking to see what the effect is. My whole reason for starting this thread was confusion over how to tell when a page was indexed, because of (to me) unexpected results with the cache and site:www... on Google.

                          • RyanPurkey @friendoffood

                            Great! Well you have lots of insights here. It sounds like you're ready to test in the near term, and build up the domain in the long term.  Good luck!

                            • friendoffood @RyanPurkey

                              Well, I'm ready to test--but still not quite sure how, since I don't know how to tell when Google has indexed the new content: sometimes it doesn't get cached, and sometimes it disappears from the site:www... listing. I've read it only takes a couple of days after Google crawls the page, and I can go with that, but I was hoping there is a way to actually 'see' evidence that it has been indexed.

                              So, while I've gotten some great input, I am somewhat unsatisfied because I'm not sure how to tell WHEN my content has really been put in the index so that the algorithm is updated for the newly crawled page.

                              • RyanPurkey @friendoffood

                                Ah, that answer really varies per website. For example, if your site is a major news site, Google's indexation is extremely fast, measured in seconds, not days. Even if you're not a news site, major sites (high domain authority) get crawled and indexed very rapidly. Since you're going to be testing your own changes, you'll learn how long this takes for your particular site.

                                • friendoffood @max.favilli

                                  Massimiliano, thanks for your input. So you're one of them, huh? 🙂 Good points. The last thing I want to do is annoy users, yet I also want to track 'real' usage, so there is a conflict. I know it is impossible to block everything I don't want, as there is always another trick to employ... I'll have to think about it more.

                                  Yeah, the cut-and-paste blocking is annoying to anyone who would want to do it. But none of my users should want to. My content is in low demand, but I hate to make anything easier for potential competition, and some who might be interested won't know how to scrape. Anyway, thanks for your feedback on that too.

                                  • friendoffood @RyanPurkey

                                    I'm sorry, but once I know they have crawled a page, shouldn't there be a way to know when it has also been indexed? I know I can get them to crawl a page immediately, or nearly so, by fetching it. But I can't tell about the indexing. Are you saying that after they crawl the page, the time to index it can vary by site and there really is no way to know when it is in the new index? That is, if it shows as newly cached, that doesn't mean it has been indexed too, and it can be indexed yet not show up in site:www..., etc.?

                                    • max.favilli @RyanPurkey

                                      Well, then I totally agree with you, Ryan; thanks for the answer. With a DA of 1, you are absolutely right.

                                      • max.favilli @friendoffood

                                        First of all, I was just browsing and I got blocked as a bot; see below:

                                        http://imgur.com/HoKqh97

                                        I would remove that cloaking.

                                        Second, understanding your visitors' behavior is one of the most complex tasks; you don't know your users' behavior until you run a lot of tests, surveys, and so on...

                                        • RyanPurkey @friendoffood

                                          Yeup! Indexing time varies. You'll be able to tell the time between crawl and indexation by when Google shows version B of your page in its cache after you made changes from version A. So if the example.html page is already in Google's index, you'll see this:

                                          You make changes to a page, example.html (version A is now version B).
                                          Google crawls example.html (version B).
                                          You check Google's cache to see whether example.html is version A or B.
                                          No?
                                          No?
                                          No?
                                          No?
                                          Yes. That's how long it takes.

                                          OR, you make a new page. It gets crawled. Checking if it's indexed... no, no, no, no, yes?! That's how long it takes.

                                          Again, this time period varies and having a site with excellent domain strength and trust usually makes it a shorter time period.  It also tends to influence how many pages Google decides to keep in its index or show to users.  Pretty much everything gets better for a site the stronger its domain authority and trust are.
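                                          The check-and-wait loop above can be sketched in Python; `wait_for_reindex`, the `marker` string, and the simulated cache are all illustrative, since there is no API for reading Google's cache and the fetch step is left to the caller:

                                          ```python
                                          import time

                                          def wait_for_reindex(fetch_cached, marker, attempts=10, interval_s=3600):
                                              """Poll the cached copy (fetch_cached is supplied by the caller)
                                              until `marker`, a string unique to version B, appears. Returns
                                              the number of polls it took, or None if it never showed up."""
                                              for n in range(1, attempts + 1):
                                                  if marker in fetch_cached():
                                                      return n
                                                  time.sleep(interval_s)
                                              return None

                                          # Simulated cache: serves version A twice, then version B.
                                          pages = iter(["version A", "version A", "version B, the new copy"])
                                          assert wait_for_reindex(lambda: next(pages), "version B", interval_s=0) == 3
                                          ```

                                          The poll count times the interval is exactly the crawl-to-index lag Ryan describes measuring by hand.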

                                          • friendoffood @max.favilli

                                            Thanks for sharing that.  I was only kidding above, but obviously it's no joking matter when a user gets blocked like you did.

                                            I just looked and see that it blocks when something/someone clicks 3 times within 30 seconds. EDIT: but that's only if it isn't keeping the session between clicks--see next post
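                                            That threshold behaves like a sliding-window rate limiter, sketched below in Python. The class and constant names are invented, and whether the third or fourth click is the one blocked is a detail of Ted's actual code; here the limit is inclusive, so the fourth click inside 30 seconds is the one flagged:

                                            ```python
                                            from collections import defaultdict, deque

                                            WINDOW_S = 30   # seconds
                                            MAX_HITS = 3    # clicks allowed inside the window

                                            class ClickLimiter:
                                                """Flag an IP once it exceeds MAX_HITS clicks within WINDOW_S seconds."""
                                                def __init__(self):
                                                    self.hits = defaultdict(deque)

                                                def allow(self, ip, now):
                                                    q = self.hits[ip]
                                                    while q and now - q[0] > WINDOW_S:  # forget clicks outside the window
                                                        q.popleft()
                                                    q.append(now)
                                                    return len(q) <= MAX_HITS

                                            limiter = ClickLimiter()
                                            assert limiter.allow("1.2.3.4", 0.0)
                                            assert limiter.allow("1.2.3.4", 5.0)
                                            assert limiter.allow("1.2.3.4", 10.0)
                                            assert not limiter.allow("1.2.3.4", 12.0)  # 4th click inside 30 s: blocked
                                            assert limiter.allow("1.2.3.4", 45.0)      # old clicks have aged out
                                            ```

                                            Note the last assertion: once the window rolls past, the same IP is allowed again, matching the session-dependent behavior described in the EDIT above.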
