I agree with Federico. I've seen Google go fishing with URL parameters (?param=xyz), and I've seen it with AJAX and hashbangs as well. How far they take this and when they choose to apply it doesn't seem to follow a consistent pattern. You can see some folks on StackExchange discussing this, too: http://webmasters.stackexchange.com/questions/25560/does-the-google-crawler-really-guess-url-patterns-and-index-pages-that-were-neve
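For reference, under Google's old (since-deprecated) AJAX crawling scheme, a hashbang URL was fetched by crawlers via an `_escaped_fragment_` query parameter. A rough sketch of that mapping (the URLs are made-up examples, and real-world escaping rules had more edge cases):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a '#!' hashbang URL to the '_escaped_fragment_' form that
    crawlers requested under Google's old AJAX crawling scheme."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # Keep '=' and '&' readable inside the fragment; escape the rest.
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=&')}"

print(escaped_fragment_url("http://example.com/page#!state=1"))
# http://example.com/page?_escaped_fragment_=state=1
```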
Best posts made by randfish
-
RE: How does Google index pagination variables in Ajax snapshots? We're seeing random huge variables.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Hi Gary - yes! That's what the thread I posted above includes - details on the differences between this new index and tool vs. the old one. The bullet points basically cover it.

-
RE: What makes a "perfectly optimized page" in 2013?
Hi James - I've been meaning to write an updated version of that post. I've got an email in my inbox with the task on my to-do list and will do my best to get to it soon. Sorry for the delay!
In the meantime, the comments above are very kind, but also accurate. It's still a pretty solid guide to on-page optimization.
-
RE: Massive amounts of incoming links caused by parked domains
Hi SoundofSongs - first off, welcome to Moz!

As far as the penalty goes, the challenge is that it may not be an automated penalty, but rather a manual one. If that's the case (which is impossible to know for sure), fixing the problem alone won't be enough. You'll need to wait for someone from Google to review the site, and they'll only do that if you submit a re-consideration request.
Here's how I'd think about this. You didn't do anything wrong other than letting the domains accidentally become duplicates. If you set up the 301 redirects again and have all those sites pointing to your main site, you shouldn't have trouble. Just send in a reconsideration request to Google through Webmaster Tools and let them know what happened and that you've fixed it. It may take a few weeks or even a couple months for them to review the request, but if that's the only issue, you should be cleared.
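As a sketch, on an Apache server (assuming mod_rewrite is enabled) the parked domain's .htaccess could 301-redirect everything to the main site - `www.mainsite.com` is a placeholder, and the exact setup depends on your host:

```apache
# Permanently (301) redirect every request on this parked domain
# to the main site, preserving the requested path.
RewriteEngine On
RewriteRule ^(.*)$ https://www.mainsite.com/$1 [R=301,L]
```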
If, however, there are other problems (bad links to the site from sources you don't control that Google thinks you've built, or content/on-page/UX issues that Google doesn't like), fixing the parked domains won't be enough, and you'll likely get a response from Google in Webmaster Tools saying that your request has been denied. At least you'll know at that point.
As far as the landingpage.com solution - I don't think it's particularly necessary, but if you think some of the parked domains are or have been spammy and don't want them pointing to your main site, then go for it.
Best of luck!
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks! Look forward to your thoughts on how this new version works for you/your business.
-
RE: Location Specific Reporting
Not yet... But it's coming really soon. I believe by end of May, you'll be able to specify only a particular subdirectory or an entire root domain to have crawled. In the meantime, you can restrict our crawler - RogerMozBot - from accessing certain areas to sculpt our crawl.
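For illustration, a robots.txt group like the one below would keep a crawler out of one section of the site (the `rogerbot` user-agent token and the `/private/` path are assumptions for this example); Python's standard-library robotparser can sanity-check the rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks Moz's crawler (user-agent token
# assumed to be "rogerbot" here) from the /private/ section only.
robots_txt = """\
User-agent: rogerbot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# rogerbot is blocked from /private/ but allowed elsewhere;
# other crawlers are unaffected by this group.
print(parser.can_fetch("rogerbot", "http://example.com/private/page"))   # False
print(parser.can_fetch("rogerbot", "http://example.com/blog/post"))      # True
print(parser.can_fetch("googlebot", "http://example.com/private/page"))  # True
```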
-
RE: Massive amounts of incoming links caused by parked domains
Happy Holidays to you, too!
-
It's possible that the penalty is related only to the duplicates, but the process I described above should help uncover whether that's the case.
-
It's definitely worth cleaning up the domains you owned that are duplicates rather than 301s. As for the other older links - it depends on how bad they are, how many there are, and whether you think there's risk that Google may penalize you for them. Unfortunately no easy answers there.
-
SEO isn't the only inbound channel! Social media, blogs, building an email list, content marketing, etc. are all options, too.
-
I like targeting highly relevant, rarely searched phrases. The CPCs tend to be lower and the quality of leads higher. And working on conversion rate is a great idea, since that helps all kinds of traffic.
Wish you all the best!
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks dMa - very glad to hear it. If there are things you think are missing or that can be improved upon, please let us know. Still in beta, obviously, and there are a number of new features and data improvements coming, but we're hoping to regain the lead in the link tool space (I'm with you, it's been way too long).
-
RE: SEO Audit "Hybrid Site"
Hi Steve - sounds like a big challenge (particularly if you can't get access to analytics or WM tools). You could certainly start by getting crawls through SEOmoz PRO just to check for errors/issues/etc. Screaming Frog is also a good tool for this type of one-off crawling when you don't need tracking over time.
There are a few good posts on site audits in particular:
- http://www.seomoz.org/blog/how-to-do-a-site-audit
- http://www.seomoz.org/blog/4-ways-to-improve-your-seo-site-audit
- http://www.seomoz.org/blog/seo-site-audits-getting-started
- http://www.distilled.net/blog/seo/do-your-very-own-site-structure-audit/
Best of luck!
-
RE: Do Page Views Matter? (ranking factor?)
I think there are elements of both iSTORM's and David's responses that are accurate. Page views in and of themselves are almost certainly not a raw ranking factor, but it could well be that engagement metrics that correlate well with page views (in many cases, at least) do have a direct or indirect positive impact on rankings.
I try not to guess at precisely the elements Google is or isn't using to influence the algorithmic rankings (based on what I read about their move to deep learning, it probably doesn't matter much anyway since the algo is becoming derivatives of thousands of metrics' interplays), but instead worry about the things that will cause the results and user experiences Google wants to reward. That was a lot of what this post was about: http://moz.com/blog/seo-correlation-causation.
-
RE: SEO and Squarespace? Is this Really an Option?
Hi - it certainly looks like there's a number of issues around basic SEO friendliness and accessibility that need addressing on that site, but I'm surprised that SquareSpace's CMS doesn't allow for/enable that. Can you edit the source code on the pages? Or contact their support to look into it?
BTW - I'd also suggest making the homepage title more friendly. Currently, it looks like SEO spam - just keywords jammed together without spaces and without the name of the business. All the page titles have inherited this problem throughout the site.
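For illustration (a made-up business and phrasing), a readable homepage title would look more like:

```html
<!-- Hypothetical example: a human-readable phrase plus the business
     name, instead of keywords jammed together without spaces -->
<title>Kitchen Remodeling in Portland | Example Design Co.</title>
```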
I might suggest reading https://moz.com/blog/visual-guide-to-keyword-targeting-onpage-optimization and https://moz.com/blog/on-page-seo-8-principles-whiteboard-friday which contain a lot more detailed information on how to think about keyword targeting and on-page SEO.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks Adam!
Re: 1) There is a toggle in the panel above the link lists to filter out common syndication and shortening URLs. Maybe try checking that? If it's not catching everything, let us know what (here or via a ticket to help@moz.com) and we can try adding those to the filtration.
Re: 2) Yeah - we have the "linking domains" view in the tool, but adding a "one link per domain" or "up to X links per domain" option in the links view is certainly something we can consider, too.
-
RE: Own Domains shown as Spam Links in Open Site Explorer
Hi Marc - if you're worried about them being potentially problematic in Google, just disavow them via Google Search Console (aka Webmaster Tools), then you can point the domains to any page or part of your site you want without concern. It's likely not a big issue regardless, but if you want to be sure, that's how I'd do it.
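For reference, a disavow file uploaded via Search Console is a plain text list; `domain:` lines disavow every link from that domain (the domain names below are placeholders):

```text
# Disavow all links from these problematic domains
domain:parked-example-1.com
domain:parked-example-2.net
# Or disavow a single spammy page
http://spam.example.org/links.html
```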
-
RE: SEO and Squarespace? Is this Really an Option?
Thanks for the heads up, Mirabile! Sad to see - hopefully Squarespace fixes this soon and re-enables fully customizable titles & meta descriptions for the other page types.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks Jack! I've been pushing the team to get a pie chart of anchor text (there are a lot of folks at Moz who think pie charts are evil and bad UI, but I like them), so I appreciate the +1 for that feature.

-
RE: What is this exactly? Whiteboard Friday warning about footer links.
Yep! The sitewide link penalty, also commonly known as the footer link penalty or the "web design by" penalty, is a pretty common (though not 100% universal) link dampener. Google mostly just ignores those links now, but they sometimes used to actively penalize for their presence (and may still in certain cases). My best advice is to instead link from your about page or another well-linked-to page on your site vs. linking from every page.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
I don't think there's much value there. We haven't seen the c-block metrics correlate any better than linking domains of late (not surprising, given that c-blocks aren't as popular a way to hide link networks anymore), so we removed them from the new product.
-
RE: Underscores, capitals, non ASCII characters in image URLs - does it matter?
It's not best practice for sure, and it doesn't help with SEO (ideally, you want clean, clear, descriptive URLs and paths). That said, if Google can index the content of the page (or the image), it's not a dealbreaker. You can check by searching for the URL/image path in Google Images, e.g. this one.