Interesting - I wonder how widespread this bug is.
Does it seem to be affecting your actual visit & unique visitor numbers, or just the charts?
Actually, the screenshot I attached isn't helpful - because there are two separate Y-axes with different scales, which explains the discrepancy.
In the case of your screenshot, I'm at a loss. I'm wondering if something in your GA implementation has changed?
Thanks for the update, Conor - glad to hear you've sorted it.
Best,
Mike
Hi Irving,
As Yusuf suggested, .htaccess files aren't publicly viewable - servers block access to them by default.
For your own site's .htaccess file, simply downloading it via FTP and opening it with Notepad or similar will work. But I take it you were looking for a tool to check the .htaccess files of public sites. If there is such a tool, it's news to me.
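If you just want to confirm that a given site's .htaccess isn't exposed, a quick script will do it. A minimal sketch - example.com is a placeholder, and this assumes the file lives at the web root:

```python
import requests

def htaccess_exposed(domain):
    """Request /.htaccess and report whether the server serves it.

    A hardened server typically answers 403 Forbidden (or 404);
    a 200 with file contents would mean the file is exposed.
    """
    resp = requests.get(f"http://{domain}/.htaccess", timeout=10)
    return resp.status_code == 200

print(htaccess_exposed("example.com"))  # expect False on a default Apache setup
```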
Best,
Mike
Hi Conor,
Sorry it's been so long without a response. This question was just assigned to me, but I haven't seen this issue before and can't offer any insight.
It's been a while now so you may have already sorted this.
If you're still stuck, let me know - I can flag this up to a wider group of people and see if anybody's seen this issue before. Worth a shot.
Best,
Mike
Hi Harriet,
Have you considered going GA Premium? It certainly isn't cheap at $150k/yr, but the platform is proven and continues to improve. I don't know where comScore's pricing stands - I'm assuming it's less expensive than GA Premium.
Two alternatives I'd recommend looking into:
I haven't had the chance to use either of these first-hand yet, but I've heard good things: they're designed for publishers, and the interface/reporting looks good.
I'd be interested to hear what you land on and why, if you care to circle back in this thread. 
Best of Luck,
Mike
Hi LW,
Sorry for the extreme delay here - the Q&A notification system went wonky for a bit and I never got the response message for this thread.
I'm sure you're past this issue by now, but yes - Googlebot Mobile should just index the mobile version of the page.
Best,
Mike
Hi Harriet,
I've only seen this a couple of times in the past, and I was always able to sort out what was up.
One thing I'd recommend is to make absolutely certain that the destination URL (and no other versions of it) is the only URL that's setting those UTM variables. In other words, ensure no additional pages are getting traffic for that campaign.
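One way to audit that quickly: export the campaign's traffic report from GA as a CSV and count the distinct landing pages. A rough sketch - the file and column names here are assumptions based on a typical export, not GA's guaranteed format:

```python
import csv

def campaign_landing_pages(csv_path, column="Landing Page"):
    """Return the distinct landing-page URLs in a GA campaign export.

    More than one URL in the result means multiple pages are carrying
    the campaign's UTM values (or receiving traffic tagged with them).
    """
    with open(csv_path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

# e.g. print(campaign_landing_pages("campaign_export.csv"))
```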
I'd also be curious to know how much traffic the site receives overall - if you're at the level where GA will actively sample your reports, I've seen sampling cause some wonky numbers, though it is usually proportional / within reason.
It's been a few weeks since you posted this question, so I'm not sure whether you're still wrestling with this - but if you're still stuck and can provide some more detail, I'll be happy to dig a little more.
Best,
Mike
I would +1 William's answer here. 301 redirects in addition to Bing's site-moving function should do the trick - particularly if the links pointing to the old domain (which the 301s will redirect) contain your brand name. I've regularly seen this carry branded search rankings over.
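For what it's worth, once the redirects are live it's worth spot-checking the most-linked old URLs - a quick sketch with placeholder URLs:

```python
import requests

def check_301(old_url, expected_new_url):
    """Fetch old_url without following redirects and confirm it
    returns a 301 pointing at the expected new-domain URL."""
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    return (resp.status_code == 301
            and resp.headers.get("Location") == expected_new_url)

# e.g. check_301("http://old-domain.com/about", "http://new-domain.com/about")
```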
Moz has chosen to keep their home page title reading "SEOmoz is now Moz" for several months now - I don't know first-hand, but I'd imagine they've done this in part as insurance so they keep ranking for their old brand name. Might be worth considering the same.
Hi Mario,
The Mozzers above are right - your best move is to get rid of the redundant /coastal/ subfolder.
Your developer is incorrect. There is no security benefit to the subfolder.
I would suggest, to keep the site secure, that you instead have a thorough read through this guide to Hardening WordPress and follow its methods, as they will cover you against the vast majority of attacks.
Also, keep regular backup copies of your database if your hosting company doesn't do this automatically. Between that and backups of the website files, you'll be well covered in the event of a hack.
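If your host doesn't handle it and you'd rather script it yourself, the database piece is a thin wrapper around mysqldump. A minimal sketch - database name and credentials are placeholders, and it assumes mysqldump is on the server's PATH:

```python
import subprocess
from datetime import date

def backup_wordpress_db(db_name, user, password, out_dir="."):
    """Dump the WordPress MySQL database to a dated .sql file."""
    out_file = f"{out_dir}/{db_name}-{date.today().isoformat()}.sql"
    with open(out_file, "w") as f:
        # mysqldump expects the password glued to -p (no space)
        subprocess.run(["mysqldump", "-u", user, f"-p{password}", db_name],
                       stdout=f, check=True)
    return out_file

# e.g. backup_wordpress_db("wp_database", "wp_user", "secret")
```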
Best,
Mike
Nice catch, Lynn. That's got to be (at least the majority of) the problem.
Hi Matt,
Majestic, Open Site Explorer and ahrefs are all showing zero links pointing to the entire domain, waydownunder.com.au.
I'm not suggesting this proves Google can't crawl/index the site - I've repeatedly seen Google index sites that don't have links yet. However, if these three major link indexes are showing zero links, there's a good chance Google isn't discovering the site through regular crawling either.
Have you tried creating and submitting a sitemap via Webmaster Tools?
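If not, even a hand-rolled sitemap will do the job for a small site - a minimal sketch following the sitemaps.org protocol (the URL list is a placeholder for your actual pages):

```python
def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(["http://waydownunder.com.au/"]))
```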
Best,
Mike
Hi Matt,
I would echo Lynn's recommendations here.
I doubt Google is actively filtering the 2nd site from search results. The duplicate content filter is applied sparingly - you'll find no shortage of duplicated sites in the index. It's also more of a results filter than an index filter: duplicate content still gets indexed, it just isn't shown in SERPs when the filter is active.
It's more likely that you simply haven't sent Google a strong enough ping that the site is worth indexing. Generate some marketing activity around the site, link to it from the current site as Lynn suggested (especially turning those pages into summaries), and I expect the site will show up in the index within a couple of weeks.
Best of Luck,
Mike
I've tested both and the logic looks fine.
However, as always with .htaccess changes, I'd recommend testing to verify this has been set up properly once you roll these out (easy enough with a quick ping / page load).
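Even a tiny script beats checking by hand once there's more than a rule or two - a sketch with placeholder URLs:

```python
import requests

# Each request URL mapped to the URL it should end up at (placeholders)
EXPECTED = {
    "http://example.com/old-page": "http://example.com/new-page",
}

for start, final in EXPECTED.items():
    resp = requests.get(start, timeout=10)  # follows redirects by default
    ok = resp.url == final and resp.status_code == 200
    print(f"{start} -> {resp.url} [{resp.status_code}] {'OK' if ok else 'FAIL'}")
```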
Hi Vince,
If there is such a tool, I haven't seen it.
Chris's suggestion to export CSV files and combine them in Excel is probably where I'd start, but I'm not sure how easy it'd be to work with the annotations in Excel. There's a Greasemonkey (Firefox) script to export them, but I haven't used it myself so I can't confirm how smoothly it works.
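If it helps, the combining step itself is easy to script before you ever touch Excel - a sketch assuming two GA CSV exports sharing a date column (file and column names are placeholders):

```python
import pandas as pd

# Merge two GA exports on their shared date column (names are placeholders)
visits = pd.read_csv("visits_export.csv")
goals = pd.read_csv("goals_export.csv")
combined = visits.merge(goals, on="Day Index")
combined.to_csv("combined.csv", index=False)
```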
I rarely find mash-ups of multiple metrics on a single chart like this helpful, as they quickly become visually confusing - but if this is exactly what your boss is looking for, you'll probably end up creating something in Excel and replacing the data each time (while saving the visualization settings). You could also hack together a macro to handle the repetitive, tedious tasks.
Sorry I don't have a better/shorter/easier answer for you here. If you do find or build a solution that is robust and/or elegant, I hope you'll head over to YOUmoz and share it with the community. 
Best of Luck,
Mike
To directly answer your question, Erin, a 301 redirect will indeed prevent users from accessing the https versions of your pages and is not the recommended approach.
Is there a reason you want to prevent users/search engines from accessing the https versions of your pages?
Simply ensuring that all links within the site navigation point to http versions, and setting a rel=canonical on all https versions back to the http versions, should do the trick.
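Once the canonicals are in, they're easy to spot-check with a script - a sketch using requests and BeautifulSoup (the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    """Fetch a page and return the href of its rel=canonical tag, if any."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return tag["href"] if tag else None

# An https URL should canonicalize to its http twin:
print(canonical_of("https://example.com/page"))  # expect "http://example.com/page"
```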
Best,
Mike
Hi LW,
I'm wondering about some particulars of your setup for this.
How are URLs handled between the three sites (1 desktop, 2 mobile)?
Are you serving up unique source code on the same URL per device, or do you have device-specific URLs for all content?
What are you using to detect the useragent and redirect the user? Is this happening server-side, or with JavaScript?
The particulars of your setup will determine your best approach. When in doubt I would follow the instructions on this page.
I would not expect two mobile versions of your site to cause a duplicate content issue - more likely that Googlebot Mobile will only see one version of the pages and index those (but as above, the technical particulars will determine this).
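For reference, if the detection is server-side, the core logic usually boils down to something like this - a deliberately simplified Flask sketch, where the m. subdomain scheme and the UA keywords are assumptions rather than your actual setup:

```python
from flask import Flask, redirect, request

app = Flask(__name__)
MOBILE_HINTS = ("iPhone", "Android", "BlackBerry")  # crude, illustrative UA matching

@app.route("/<path:page>")
def route_by_device(page):
    ua = request.headers.get("User-Agent", "")
    if any(hint in ua for hint in MOBILE_HINTS):
        # Assumed URL scheme: m.example.com mirrors example.com page-for-page
        resp = redirect(f"http://m.example.com/{page}", code=302)
    else:
        resp = app.make_response(f"desktop version of /{page}")
    resp.headers["Vary"] = "User-Agent"  # tell caches/crawlers the response varies by UA
    return resp
```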
Best,
Mike
Thanks for chiming in on this, Joost.
I wasn't 100% certain that geo_sitemap.xml was a problem, but I thought the xmlns reference to http://www.google.com/geo/schemas/sitemap/1.0 on line 2 might be throwing Google off - I take it they'll just ignore this and crawl the doc as any other XML file?
Thanks again.
Hi Anthony,
Sorry for the delay on this. In migrating over to the new Moz.com platform, Q&A messaging for admins has been a bit spotty.
You are right - geositemap.xml is using the "geo sitemap" protocol that Google no longer supports. This may cause Google not to follow the reference to locations.kml contained therein.
Unfortunately, I don't have an alternative recommendation to Yoast's SEO plugin for this. Manually creating your XML may be your best option - software like GSiteCrawler can speed up the process, and you can then add the KML reference by hand.
If the plugin's output can't be manually configured, and having the KML file crawled is important enough to your goals to be a top priority, moving away from the plugin in favor of the manual route above seems a clear choice to me. I haven't dealt with KML files for WordPress in the past, though, so I can't point you to a specific replacement.
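If you do go the manual route, the standard sitemap protocol is simple enough to generate directly, with the KML file listed as an ordinary URL - a sketch (URLs are placeholders, and whether Google still acts on a KML reference like this is exactly the open question):

```python
import xml.etree.ElementTree as ET

def write_sitemap(page_urls, kml_url, out_path="sitemap.xml"):
    """Write a standard sitemaps.org sitemap (no deprecated geo extensions),
    listing the site's pages plus the KML file as a plain URL."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in list(page_urls) + [kml_url]:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

# e.g. write_sitemap(["http://example.com/"], "http://example.com/locations.kml")
```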
Best,
Mike