Random Google?
-
In 2008 we performed an experiment that showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution).
Today I put the results together, analysed the data we had, and got some strange results hinting at the possibility that Google purposely throws in a deviation from normal behaviour here and there.
Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act that produces quasi-random behaviour?
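One way to probe the "deliberate randomness" question is to check whether the observed anomalies are statistically independent of each other. Below is a minimal sketch using a Wald-Wolfowitz runs test on a binary log of observations ("page behaved normally" = 0, "page deviated" = 1). The observation sequence is made up for illustration; it is not the experiment's actual data.

```python
import math

def runs_test_z(seq):
    """Z-statistic of the Wald-Wolfowitz runs test on a 0/1 sequence.
    |z| noticeably above ~1.96 suggests the deviations cluster or
    alternate more than independent random events would."""
    n1 = seq.count(1)
    n0 = seq.count(0)
    n = n0 + n1
    # A run is a maximal block of identical symbols.
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    expected = 2 * n0 * n1 / n + 1
    variance = (2 * n0 * n1 * (2 * n0 * n1 - n)) / (n ** 2 * (n - 1))
    return (runs - expected) / math.sqrt(variance)

# Hypothetical observation log: 1 = anomaly (odd caching/indexation), 0 = normal.
observations = [0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0]
z = runs_test_z(observations)
print(round(z, 2))
```

A z-statistic near zero is consistent with purely random deviations; a large value would hint at some underlying schedule or trigger, though with samples this small the test has little power either way.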
-
I believe there are too many other possible factors that could have prevented that test from being accurate. How do you know, 100% for sure, that there are no variables you are overlooking?
-
I wonder if those PR5 "anomalies" and the PR3 one aren't actually due to other ranking factors, such as external backlinks to those pages.
-
I strictly monitored the pages to make sure there were no inbound links, for example, and the link profile was clean until the public TBPR update at least. One suspicion I had, but was not able to confirm and include in my research, was that a plain link had been added to that odd page, which is PR3 when it shouldn't be. That would basically support the theory that any external link fortifies the importance of a page to the point of better indexation/caching. Something like internal link + external link = match made in heaven.
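The "internal link + external link" theory can be illustrated with a toy PageRank computation: the same page scored with only an internal link versus with one extra external link pointing at it. The graph, page names, and damping factor below are illustrative assumptions, not Google's actual setup.

```python
def pagerank(links, nodes, d=0.85, iters=100):
    """Power-iteration PageRank. links: dict source -> list of targets."""
    n = len(nodes)
    rank = {p: 1.0 / n for p in nodes}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in nodes}
        for src in nodes:
            outs = links.get(src, [])
            if outs:
                share = rank[src] / len(outs)
                for dst in outs:
                    new[dst] += d * share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for dst in nodes:
                    new[dst] += d * rank[src] / n
        rank = new
    return rank

nodes = ["home", "odd_page", "external_site"]

# Case 1: odd_page is linked only internally from home.
internal_only = {"home": ["odd_page"]}

# Case 2: same graph plus one plain external link to odd_page.
with_external = {"home": ["odd_page"], "external_site": ["odd_page"]}

r1 = pagerank(internal_only, nodes)
r2 = pagerank(with_external, nodes)
print(r1["odd_page"], r2["odd_page"])  # the external link raises the score
```

In this toy model the external link lifts the page's score because the linking site's rank flows to it directly instead of being spread uniformly, which is at least consistent with the idea that one plain external link could tip a page into better indexation or caching.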
-
There were no detectable inbound links to any of those pages before the PageRank update. What other signals could have affected it? Content quality alone? By the way, the content was written by the same writer at the same quality level.