Best posts made by Dr-Pete
-
RE: Tactics to Influence Keywords in Google's "Search Suggest" / Autocomplete in Instant?
Someone who will not be named (it rhymes with "Bomb Itch Slow") told me to Mechanical Turk the crap out of it.
-
Google's Mobile Update: What We Know So Far (Updated 3/25)
We're getting a lot of questions about the upcoming Google mobile algorithm update, and so I wanted to start a discussion that covers what we know at this point (or, at least, what we think we know). If you have information that contradicts this or expands on it, please feel free to share it in the comments. This is a developing situation.
1. What is the mobile update?
On February 26th, Google announced that they would start factoring in mobile-friendliness as a ranking signal. The official announcement is here. Of note, "This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results."
2. When will the update happen?
In an unprecedented move, Google announced that the algorithm update will begin on April 21st. Keep in mind that the roll-out could take days or weeks.
3. Will this affect my desktop rankings?
As best we know - no. Mobile-friendliness will only impact mobile rankings. This is important, because it suggests that desktop and mobile rankings, which are currently similar, will diverge. In other words, even though desktop and mobile SERPs look very different, if a site is #1 on desktop, it's currently likely to be #1 on mobile. After April 21st, this may no longer be the case.
4. Is this a boost or a demotion?
This isn't clear, but practically it doesn't matter that much and the difference can be very difficult to measure. If everyone gets moved to the front of the line except you, you're still at the back of the line. Google has implied that this isn't a Capital-P Penalty in the sense we usually mean it. Most likely, the mobile update is coded as a ranking boost.
5. Is this a domain- or page-based update?
At SMX West, Google's Gary Illyes clarified that the update would operate on the page level. Any mobile-friendly page can benefit from the update, and an entire site won't be demoted simply because a few pages aren't mobile-friendly.
6. Is mobile-friendly on a scale or is it all-or-none?
For now, Google seems to be suggesting that a page is either mobile-friendly or not. Either you make the cut or you don't. Over time, this may evolve, but expect the April 21st launch to be all-or-none.
7. How can I tell if my site/page is mobile-friendly?
Google has provided a mobile-friendly testing tool, and pages that are mobile-friendly should currently show the "Mobile-friendly" label on mobile searches (this does not appear on desktop searches). Some SEOs are saying that different tools/tests are showing different results, and it appears that the mobile-friendly designation has a number of moving parts.
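One of those moving parts is the responsive viewport meta tag, and you can do a rough first-pass check for that one yourself. Here's a minimal sketch in Python (illustrative only - Google's actual test weighs rendering, tap targets, font sizes, and more, so passing this check alone proves nothing):

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Scan an HTML document for a <meta name="viewport"> tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "viewport":
                self.has_viewport = True

def looks_mobile_friendly(html: str) -> bool:
    """Rough heuristic: a page with no viewport meta tag will
    almost certainly fail Google's mobile-friendly test."""
    checker = ViewportCheck()
    checker.feed(html)
    return checker.has_viewport
```

Again, treat a "False" here as a red flag worth investigating, not a "True" as a passing grade - use Google's own tool for the real verdict.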
8. How often will mobile data refresh?
Gary also suggested (and my apologies for potentially confusing people on Twitter) that this data will be updated in real-time. Hopefully, that means we won't have to worry about Penguin-style updates that take months to happen. If a page or site becomes mobile-friendly, it should benefit fairly quickly.
We're actively working to re-engineer the MozCast Project for mobile rankings and have begun collecting data. We will publish that data as soon as possible after April 21st (assuming it's useful and that Google sticks to this date). We're also tracking the presence of the "Mobile-friendly" tag. Currently (as of 3/25), across 10,000 page-1 mobile results, about 63% of URLs are labeled as "Mobile-friendly". This is a surprisingly large number (to me, at least) - we'll see how it changes over time.
-
"Update" in Search Console is NOT an Algo Update
We've had a few questions about the line labeled "Update" in Google Search Console on the Search Analytics timeline graph (see attached image). Asking around the industry, there seems to be a fair amount of confusion about whether this indicates a Google algorithm update.
This is not an algorithm update - it indicates an internal update in how Google is measuring search traffic. Your numbers before and after the update may look different, but this is because Google has essentially changed how they calculate your search traffic for reporting purposes. Your actual ranking and traffic have not changed due to these updates.
The latest update happened on April 27th and is described by Google on this page:
Data anomalies in Search Console
Given the historical connotations of "update" in reference to Google search, this is a poor choice of words and I've contacted the Webmaster Team about it.
-
RE: Hoping someone could take some time to give me some feedback / advice. Thanks!
Thanks for sharing your story, Rick. My wife and I lost our first pregnancy due to Turner's Syndrome, so I'm painfully familiar with how random the genetic lottery can be. I'm happy to say we have a healthy, happy 17-month-old girl now. I'm glad to hear Noah is doing well, and I'm heartened to hear how proactive the doctors are being.
First off, I'd just like to say that you're doing a lot right. You have a well-designed site with great content, a good core structure, and many of the important features of a modern site/blog. The wide world of SEO can be overwhelming, but it's rare that you need to tackle it all at once.
I think it's great to be thinking proactively about categorizing your content, and it's ok to let that evolve organically as your needs become clear. Categorizing the videos certainly makes sense.
At this point, though, given that your basic structure is good and you've got a lot of content, the social and link-building aspects are probably equally or more important. You have one tremendous tool at your disposal - sincere passion that can connect you to an audience. Your own outreach efforts, interactions with other parents, discussion boards, communities, etc. will go a LONG way. As you build relationships, links will start building themselves.
One thing that wasn't clear to me until I fully read your post and dug into the site was that your wife is a pediatrician. The "Mom MD" just read like a cute category name to me (no offense intended - that was just my first impression). This fact, IMO, adds a lot of credibility to what you're doing, and makes this more than a personal blog. I'd make this clear, especially on the About page and at the top of the Mom MD section.
-
RE: Are press release sites useful?
I have some smaller clients who have had limited luck with it, but I think it's best to just stick to one of the sites and do periodic releases (maybe every couple of months). If nothing else, it'll give you a sense of what's working, and you can take some of the popular releases and push to get them a broader market.
What I wouldn't do is go after multiple low-value PR services and plaster the same releases everywhere you can. At best, it's diminishing returns - at worst, they'll be devalued. I'm with Peter G. - the best press release opportunities come through relationships with the media. Obviously, though, that takes a lot more time.
-
RE: Edu links service
Rand has a great post on link valuation:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
There's no magic to .edu links, frankly - the data over the past couple of years doesn't really support that they're inherently better than .com's, etc. It's true that many .edu sites are high-authority sites, of course, but that's just a correlation (it's not that Google prefers .edu or .gov inherently).
Within any site, though, you have to look at the Page Authority, the number of links on that page, the placement of the links, the anchor text, the relevance (to some degree), and a lot of other factors. Let's take a non-edu example - DMOZ. People kill themselves for DMOZ links, but lately I'm seeing DMOZ listings where the entire page isn't indexed because it's so deep. No indexation means ZERO link juice. So, even though it's DMOZ, the link is worthless.
-
Crawl Test is now On-Demand Crawl!
If you've been with Moz a while, you may have used our old Crawl Test tool. A year ago we launched an all-new, campaign-based Site Crawl (with an entirely rebuilt crawl engine), but Crawl Test fell into disrepair and we haven't had a solid tool for crawling non-campaign domains.
I'm happy to announce that we've just launched an all-new On-Demand Crawl, built on the new Site Crawl engine, with a UI that's focused on quick insights. Moz Pro Standard tier customers can run up to 5 crawls per month at 3,000 pages per crawl (crawls are saved for 90 days), with per-month limits increasing at higher tiers.
Most On-Demand Crawls should run in a few minutes, making the tool perfect for getting quick insights for sales meetings, vetting prospects, or analyzing competitors. We've written up a sample case study, or logged-in customers can go directly to On-Demand Crawl.
Try it out -- we'd love to hear your use cases (either here or in the blog post comments).
-
RE: Are Meta-keywords coming back?
The short answer is: No. They're not coming back, in the sense that anything has changed or that they carry any more weight than they did last year. All signs point to their continued decline. Google has publicly stated that the tag carries no positive ranking value.
Technically, Alan is correct - evidence suggests that Yahoo/Bing used Meta keywords as a ranking signal more recently than Google did. Most of that evidence is 2+ years old, though, and I've seen no compelling reason to think it will tip the balance in any competitive situation on Bing. Even that 2009 article basically says: "Sure, use it, but don't expect much", IMO.
Here's the other problem - Meta keywords has been used as a negative ranking signal, and probably still is to some degree. In other words, you might not gain much or anything from using it, but if you spam it, you could get devalued. My gut feeling is that the negative signal is much, much stronger than the positive one, and even Google may still use it as a negative signal. I'm certain that Yahoo/Bing has used it as a negative signal (not sure if they still do).
I tend to agree that the competitive fears are overblown. Any decent site's keyword targets should be pretty clear - otherwise, it's not a very well SEO'd site.
Personally, if you want to use them, use them - but keep them short, sweet, and relevant. Once you do, get on with your life.
-
RE: How highly do you value a link from the BBB?
This gets into the realm of opinion pretty fast - it can be shockingly difficult to measure the value of one link. Here are a few of my opinions:
(1) One link is one link. It's rarely the magic pill people want it to be, even from a very authoritative site. I've seen people get a link like this and then sit on their hands waiting for a sudden change in rankings, and it almost never comes. If you're just starting out and you have little or no link profile, a strong link can kick-start you, but I wouldn't pay $750 just to get a link if your site is established (I'm not sure I'd pay it even if your site is new).
(2) DA and PA both matter, and how much each matters can really vary with the situation. Your profile on a deep page of BBB is not an authority=96 link. It will carry weight, but the weight of any given profile could vary a lot.
(3) BBB has gotten a bit more aggressive, IMO, and I suspect Google will devalue these links over time. People tell me that they haven't yet, in this case, but it is, in essence, a paid link. Any day, Google could say "These BBB links are counting too much" and just lower the volume. So, don't put all your eggs in one basket, no matter what you do.
Now, to be fair, your BBB listing does have other value, like using it as a trust signal. The business case for spending the money goes beyond SEO, and that's a decision you have to make for yourself. If 100% of your interest in the listing is for a followed link, though, I personally would spend the money elsewhere.
-
RE: Looking for services to publish articles or blog posts with everlasting links.
We don't generally condone paid links here on SEOmoz, because we feel the risks often outweigh the rewards. However, Eppie fairly notes that they do (too often) work. The problem is that, even putting ethics aside, most people just don't do it very well.
I actually think Shane makes a good point - some of these links aren't really "everlasting" in the full sense of the word. Article marketing and paid blog posts often get archived quickly, and while the links continue to exist, they get rapidly devalued simply by moving in the internal structure. These paid networks have to continue to sell new links, and selling new links often means archiving old links and diluting existing content. So, if you pay once, expect your link to be treated like a 2nd-class citizen down the road. That's just the nature of that business, IMO. With a monthly fee, they can at least afford to keep your link active.
There are "paid" options that Google tends to not view as critically, such as:
(1) Editorially-reviewed directories
(2) Sponsorships and membership organizations
(3) Paid press-release services (although not really "everlasting")
People tend to only think of the big ones for (1) and (2) and often overlook niche directories, smaller organizations, local organizations, etc. The nice thing about the smaller sites is that you may be one of a half-dozen paid listings/sponsors, as opposed to one of 10,000 articles in an article-marketing network.
I'll leave this open as a discussion in case others have constructive suggestions.
-
RE: Canonicalization - Some advice needed :)
Seems like Matt and Marcus have you on the right track. With a real-estate site, duplicates and near-duplicates are very common, since you're adding and removing properties all the time and there are many search options and categories. I do agree that search-friendly URLs, long-term, where each property has a fixed URL, are definitely the best bet. In the meantime, though, a solid canonical structure helps a lot.
Ease into it - don't go sitewide in one fell swoop without a plan, unless you're having clear ranking problems. Start with your biggest problem areas, monitor/measure, and work from there. You can always check for indexed duplicates by running a Google search like:
site:daft.ie intitle:"176 Rathgar Road"
In this case, I'm not seeing any index issues, although I think Matt's concerns are valid.
I'd also consider rel=prev/next for search results pages, as that can help focus Google, too. Again, take it one step at a time and start with the biggest problems. It'll mitigate your risk all around.
-
RE: Issue now resolved
I don't think that multiple links back to a client from your site is a big issue. Plenty of sites have site-wide links - Google may discount those links after a while (5,000 links from 1 site isn't worth much more than just a few links from that 1 site), but it's not generally harmful unless it seems manipulative or is part of a larger pattern (like an obvious link farm).
For example, I once had a blog comment favorited on John Battelle's search blog, which meant it went into the sidebar and gave me, briefly, 1000s of links from his site. If anything, that was a net positive.
I have one concern, though - it sounds like this is happening because you're allowing any search that someone runs to become a page. This could have Panda implications and start to look like thin content. Google isn't really a fan of your internal search results (they want their search to land on deep content, not more search), so this tactic can spin out a lot of pages that look low-value. When you have 12+ Million pages in your index, this is something I'd look at closely. Even if you haven't been hit by Panda, you could be diluting your own ranking ability for pages that aren't very high-value.
Of course, that depends a LOT on the scope and how these pages are created. I'm not clear how they're generated or how many we're talking about.
-
RE: Canonical to the page itself?
I think it's good for some pages, especially the home-page, because you can naturally have so many variants ("www" vs. non-www, for example). It's a lot easier to pre-emptively canonicalize them than 301-redirect every URL variant that might pop up over time.
While Alan's concerns are technically correct, I've never seen evidence that either Google or Bing actually devalue a page for a self-referencing canonical. For Google, the risks of duplicates are much worse than the risk of an unnecessary canonical tag, IMO. For Bing, I don't really have good data either way. More and more people use canonical proactively, so I suspect Bing doesn't take action.
I don't generally use it site-wide, unless needed, but I almost always recommend a canonical on the home-page, at least for now. Technical SEO is always changing.
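As a sketch of that pre-emptive approach, here's roughly how you might collapse host variants into one self-referencing canonical (Python; `www.example.com` is just a stand-in for whatever your preferred domain and protocol are):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred host - swap in your own domain and protocol.
CANONICAL_HOST = "www.example.com"

def canonical_tag(url: str) -> str:
    """Collapse host variants ("www" vs. non-www, http vs. https)
    into one canonical URL and emit the matching link tag."""
    parts = urlsplit(url)
    canonical = urlunsplit(("https", CANONICAL_HOST, parts.path or "/", "", ""))
    return '<link rel="canonical" href="%s" />' % canonical
```

So `http://example.com/`, `https://www.example.com/`, and friends all emit the same tag pointing at one home-page URL, instead of you chasing each variant with a 301.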
-
RE: Edu links service
I appreciate your transparency, but to me that looks like low-quality article spinning. It's ok to a point, and it may get you a short term boost, but those pages are going to be devalued over time. Plus, they have no other value (those links won't drive traffic).
As for the argument that Google can do whatever they want so that makes anything ok, I strongly disagree. There are link-building tactics that can create long-term problems. Should a client risk a full-on penalty for a low-quality link-building tactic that might get them a 5% boost for 3 months? For me to suggest that as an SEO would be grossly irresponsible. There are smart risks and there are bad risks.
-
RE: Do I need to add canonical link tags to pages that I promote & track w/ UTM tags?
I find Google is usually good about UTM parameters, but not always - for use in AdWords, they're almost never a problem, but when you use them for custom tracking, they can start to cause duplicates. Bing/Yahoo also don't handle them very well.
I'm not sure on the scope of your site/usage right now, so it's hard to give a definitive solution, but my gut reaction is that I would use canonical tags on the affected pages. If you want to double-check, you can test for the URLs in the Google index. Use something like:
site:example.com inurl:utm=
If they're not being indexed, you're probably ok, and can just keep an eye on it. If it's just a few landing pages, though (and not a massive, site-wide issue), I'd be proactive and put a canonical tag in place, if it were me.
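For the canonical href itself, you'd typically point at the URL with the tracking parameters stripped. A small sketch of that normalization (Python, standard library only - just one way to derive the target, not a full solution):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_utm(url: str) -> str:
    """Drop utm_* tracking parameters so all tracked variants of a
    landing page collapse to one canonical target URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Every tagged variant of the page then maps back to the same clean URL, which is exactly what you'd put in the canonical tag.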
-
RE: Removing Content 301 vs 410 question
Let me jump in and clarify one small detail. If you delete a page, which would naturally result in a 404, but then 301-redirect that page/URL, there is no 404. I understand the confusion, but ultimately you can only have one HTTP status code. So, if the page properly 301s, it will never return a 404, even if it's technically deleted.
If the page 301s to a page that looks like a "not found" sort of page (content-wise), Google could consider that a "soft 404". Typically, though, once the 301 is in place, the 404 is moot.
For any change in status, the removal of crawl paths could slow Google re-processing those pages. Even if you delete a page, Google has to re-crawl it to see the 404. Now, if it's a high-authority page or has inbound (external) links, it could get re-crawled even if you cut the internal links. If it's a deep, low-value page, though, it may take Google a long time to get back and see those new signals. So, sometimes we recommend keeping the paths open.
There are other ways to kick Google to re-crawl, such as having an XML sitemap open with those pages in them (but removing the internal links). These signals aren't as powerful, but they can help the process along.
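To make the "only one status code" point concrete, here's a toy sketch of what a crawler ultimately sees (Python, against a simulated response map rather than live HTTP - purely illustrative):

```python
def final_status(url, responses, max_hops=10):
    """Follow a chain of simulated (status, location) responses and
    return the status code a crawler ultimately sees for `url`.
    Returns None on a redirect loop or too many hops."""
    seen = set()
    for _ in range(max_hops):
        status, location = responses.get(url, (404, None))
        if status not in (301, 302) or location is None:
            return status          # terminal response - this is what counts
        if url in seen:
            return None            # redirect loop
        seen.add(url)
        url = location
    return None                    # chain too long

# A deleted page that 301s never surfaces its 404 - the crawler
# only ever sees the 301 and then the destination's 200:
responses = {
    "/old-page": (301, "/new-page"),
    "/new-page": (200, None),
}
```

In that example, `final_status("/old-page", responses)` comes back 200 - the underlying "deleted" state is invisible, which is the confusion in the original question.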
As to your specific questions:
(1) It's very tricky, in practice, especially at large-scale. I think step 1 is to dig into your index/cache (slice and dice with the site: operator) and see if Google has removed these pages. There are cases where massive 301s, etc. can look fishy to Google, but usually, once a page is gone, it's gone. If Google has redirected/removed these pages, and you're still penalized, then you may be fixing the wrong problem or possibly haven't gone far enough.
(2) It really depends on the issue. If you cut too deep and somehow cut off crawl paths or stranded inbound links, then you may need to re-establish some links/pages. If you 301'ed a lot of low-value content (and possibly bad links), you may actually need to cut some of those 301s and let those pages die off. I agree with @mememax that sometimes a healthy combination of 301s/404s is a better bet - pages go away, and 404s are normal if there's really no good alternative to the page that's gone.
-
RE: How to prevent duplicate content at a calendar page
Sadly, the short answer is that you can't have it all. Either you index the separate calendar pages, get more pages/content out there and risk some "thinning" of your index, or you focus on one page, maximize the SEO value, but then lose the individual pages.
I would not 301 or 302 to the individual calendar URLs - that kind of daily URL shifting is going to look suspicious, Google will not re-cache consistently, and you're going to end up with a long-term mess, I strongly suspect.
I actually tend to agree with Muhammed and Paragon that a viable option would be to let the individual days have their own content, but then canonical to the main calendar page to focus the search results. That way, users can still cycle through each individual day, but Google will focus on the core content. In a way, that's how a blog home-page works - the content changes daily, but you're still keeping the bots focused on one URL.
Think of it in terms of usability, too. How valuable is old/outdated content to search users? They might find something relevant on an old page, but they still probably want to see the main calendar and view recent content.
Where are the links to the individual days, if "/calendar" always has today's content? I'm wondering if there's a hybrid approach, like letting the most recent 30 days all have their own URLs, but then redirecting or using rel-canonical to point to the main page after 30 days.
-
RE: Are pages with a canonical tag indexed?
I have to disagree on this one. If Google honors a canonical tag, the non-canonical page will generally disappear from the index, at least inasmuch as we can measure it (with "site:", getting it to rank, etc.). It's a strong signal in many cases.
This is part of the reason Google introduced rel=prev/next for paginated content. With canonical, pages in the series aren't usually able to rank. Rel=prev/next allows them to rank without clogging up the index (theoretically). For search pagination, it's generally a better solution.
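For reference, here's a sketch of what the tags look like across a paginated series (Python generating the markup; the `?page=N` URL scheme is just an assumption - adapt it to your own URLs):

```python
def pagination_links(base, page, total):
    """Emit the rel=prev/next link tags for page `page` of a
    `total`-page series. Page 1 lives at the base URL; later
    pages get a hypothetical ?page=N parameter."""
    url = lambda n: base if n == 1 else "%s?page=%d" % (base, n)
    tags = []
    if page > 1:
        tags.append('<link rel="prev" href="%s" />' % url(page - 1))
    if page < total:
        tags.append('<link rel="next" href="%s" />' % url(page + 1))
    return tags
```

Note that page 1 only gets a "next" and the last page only gets a "prev" - the chain has to terminate cleanly at both ends for Google to treat it as one series.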
If your paginated content is still showing in large quantities in the index, Google may not be honoring the canonical tag properly, and they could be causing duplicate content issues. It depends on the implementation, but they recommend these days that you don't canonical to the first page of search results. Google may choose to ignore the tag in some cases.
-
RE: Duplicate title-tags with pagination and canonical
Unfortunately, it can be really tough to tell if Google is honoring the rel=prev/next tags, but I've had gradually better luck with those tags this year. I honestly think the GWT issue is a mistake on Google's part, and probably isn't a big deal. They do technically index all of the pages in the series, but the rel=prev/next tags should mitigate any ranking issues that could occur from near-duplicate content. You could add the page # to the title, but I doubt it would have any noticeable impact (other than possibly killing the GWT warning).
I would not canonical to the top page - that's specifically not recommended by Google and has fallen in disfavor over the past couple of years. Technically, you can canonical to a "View All" page, but that has its own issues (practically speaking - such as speed and usability).
Do you have any search/sort filters that may be spinning out other copies, beyond just the paginated series? That could be clouding the issue, and these things do get complicated.
I've had luck in the past with using META NOINDEX, FOLLOW on pages 2+ of pagination, but I've gradually switched to rel=prev/next. Google seems to be getting pickier about NOINDEX, and doesn't always follow the cues consistently. Unfortunately, this is true for all of the cues/tags these days.
Sorry, that's a very long way of saying that I suspect you're ok in this case, as long as the tags are properly implemented. You could tell GWT to ignore the page= parameter in parameter handling, but I'm honestly not sure what impact that has in conjunction with rel=prev/next. It might kill the warning, but the warning's just a warning.
-
RE: How far can I push rel=canonical?
I tend to agree - you always run the risk with cross-domain canonical that Google might not honor it, and then you've got a major duplicate content problem on your hands.
I think there's a simpler reason, in most cases, though. Three unique sites/brands take 3X (or more, in practice) the time and energy to promote, build links to, build social accounts for, etc. That split effort, especially on the SEO side, can far outweigh the brand benefits, unless you have solid resources to invest (read that "$$$").
To be fair, I don't know your strategy/niche, but I've just found that to be true 95% of the time in these cases. Most of the time, I think building sub-brands on sub-folders within the main site and only having one of each product page is a better bet. The other advantage is that users can see the larger brand (it lends credibility) and can move between brands if one isn't a good match.
The exception would be if there's some clear legal or competitive reason the brands can't be publicly associated. In most cases, though, that's going to come with a lot of headaches.