Questions
-
Anyone else notice a traffic drop around June 28th 2016?
There was noticeable algorithmic activity in early June and again around June 27th (right in your timeline). No updates were confirmed, but Glenn Gabe did some analysis around a "Panda-like" update: http://www.gsqi.com/marketing-blog/june-2016-google-algorithm-update/ Glenn tracks a set of sites hit by previous penalties (including Panda and Penguin), and while he, like all of us, is forced to speculate, his data is generally pretty solid, IMO.
Search Engine Trends | | Dr-Pete0 -
Should I connect my GA account to SEMRush?
Hello, Thanks for asking. But, there's no need to worry! SEMrush will not share your data with anyone, especially not your competitors. That wouldn't be right. All of our traffic and ad spend figures are derived from publicly available sources or come to us from trusted third parties. I hope this has been helpful.
Online Marketing Tools | | DavidBlack0 -
Split test content experiment - Why can't GA identify a winner here?
Hey Emeka,

Short Answer: You're correct. Effectively what Google is saying here is that they don't have enough statistical confidence to definitively tell you that the variation is outperforming the original at a 95% confidence level, but they do at a 93.8% confidence level.

Quick Note: 95% is the lowest setting in GA Experiments.

Long Answer: The math behind this statistical significance calculation is as follows (full credit to vwo.com for their A/B Testing Significance Calculator and for doing all the work here).

Link to Image One - This is simply the data of the Control vs. Variation, with the Conversion Rate and Standard Error for each. Conversion Rate is: Conversions / Sessions. Standard Error is: √(Conversion Rate × (1 − Conversion Rate) / Visitors)

Link to Image Two - Confidence Levels, Z-score, and P-value. To find whether something is truly significant at a specific confidence level, we need to calculate the Z-score, then use that value to find the P-value, and from there we can determine the confidence level. Z-score is: (Control Conversion Rate − Variation Conversion Rate) / √(Control Standard Error² + Variation Standard Error²). For the P-value, we evaluate the normal distribution (mean 0, standard deviation 1) at the Z-score. The easiest way to do this is to use an online tool; here's a link to your specific example.

Finally, we take the confidence level expressed as a decimal (i.e. 0.90, 0.95, 0.99) and 1 minus that value (i.e. 0.10, 0.05, 0.01). If the P-value is greater than the confidence level or less than 1 minus the confidence level, the result is significant; otherwise it is not. Let me explain that using our example: at 95% confidence, our P-value needs to be <0.05 or >0.95. Since our P-value is 0.05799, it doesn't meet either of those requirements, and as such is not significant at that confidence level.

I know that's a lot of math, but this is why Google Experiments is saying that the result is not statistically significant. Hope this helps!
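The steps above can be sketched in a few lines of Python using only the standard library. The session and conversion counts below are hypothetical, not the actual numbers from this experiment:

```python
from statistics import NormalDist
from math import sqrt

def ab_significance(control_sessions, control_convs,
                    variation_sessions, variation_convs):
    """Z-test for two conversion rates, following the steps described above."""
    # Conversion Rate = Conversions / Sessions
    cr_c = control_convs / control_sessions
    cr_v = variation_convs / variation_sessions
    # Standard Error = sqrt(CR * (1 - CR) / Visitors)
    se_c = sqrt(cr_c * (1 - cr_c) / control_sessions)
    se_v = sqrt(cr_v * (1 - cr_v) / variation_sessions)
    # Z-score = (CR_control - CR_variation) / sqrt(SE_control^2 + SE_variation^2)
    z = (cr_c - cr_v) / sqrt(se_c**2 + se_v**2)
    # P-value: standard normal CDF (mean 0, standard deviation 1) at the z-score
    p_value = NormalDist(mu=0, sigma=1).cdf(z)
    return z, p_value

# Hypothetical counts: 1000 sessions each, 100 vs. 130 conversions
z, p = ab_significance(1000, 100, 1000, 130)
# Significant at 95% if p < 0.05 or p > 0.95
print(f"z = {z:.3f}, p = {p:.4f}, significant at 95%: {p < 0.05 or p > 0.95}")
```

Plugging in your own session and conversion counts reproduces the same arithmetic the VWO calculator performs.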
Let me know if you have any further questions on this! Trenton
Conversion Rate Optimization | | TrentonGreener0 -
Implementation advice on fighting international duplicate content
Unfortunately yes, you will need to rerun the process with the tool.
Local Website Optimization | | gfiorelli10 -
Can hotlinking images from multiple sites be bad for SEO?
Sorry, hotlinking was the wrong word to use, we're actually just embedding the images. Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images and therefore use it as an indicator of spam? Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites? For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise this. If there's a potential SEO impact, we'll sort it quick, if not, then we'll do more pressing things first.
Technical SEO Issues | | OptiBacUK0 -
If Google doesn’t know we’re hosted in the UK, does that affect our SERPs?
Hi Des, Thanks, I didn't know that's why we were assigned a "special crawl rate" in WMT. Could the special crawl affect our ranking? For example, Google puts a lot of weight on freshness, so if Google is crawling us less (we can't tell if it's more or less than before), could this make our site look less fresh? We have really tried our best to rule out all other possibilities. Our content is much better and more frequent than it was before, and our link building is natural and gradual. We've also looked at over-optimisation and our competitors. Our competitors are Wikipedia, a couple of national UK newspapers, Harvard, a medical encyclopaedia and a single American competitor. We're the first UK company to appear in the SERPs. Whilst these are obviously very big companies, none of them (with the exception of the American company) targets the keyword as much as our website does. Incidentally, we did come back up to 4th yesterday but we've already dropped a place today, so it doesn't look like it'll last. The other thing we found really strange is that the singular version of our keyword didn't drop at all and has stayed very stable; it's just the plural keyword that dropped. The vast majority of our anchor text uses the plural version (it's in our brand name) and the domain also contains the plural version. Was there an algorithm change around that time, or maybe are we over-optimising the plural keyword? (Is that even possible?) Thanks, James
Search Engine Trends | | OptiBacUK0 -
Does Google take email server IP blacklists into account?
I have never read any evidence suggesting that a blacklisted mail server influences rankings. Many websites also share an IP address, and to penalize one would penalize all. However, banning sites based on a shared IP address does occur. So in short, I would say it's best avoided, just to be safe. Good read: http://www.seomoz.org/ugc/the-penguin-update-how-google-identifies-spam
Web Design | | KevinBudzynski0 -
Can a "Trusted Retailer" badge scheme affect us in the SERPs?
Would putting nofollow and noindex on the FAQ itself make a difference? That should make it obvious to Google that we don't want any of the link juice. I think that is a good idea. That should eliminate risk with Google and ease the concerns of affiliates who think like me.
White Hat / Black Hat SEO | | EGOL0 -
How does badly formatted HTML affect SEO?
Great, thanks for the info. I always thought Google was really hot on compliance, but good to know there is a bit of leeway.
Intermediate & Advanced SEO | | OptiBacUK0 -
Can I point some rel alternate pages to a 404?
Hi Greg, Forgot to say thanks! We ended up taking your advice and we've made sure each page uses rel alternate correctly. Took a bit of development time, but I'm sure it's worth it in the end. Cheers, James
International Issues | | OptiBacUK0