Best posts made by MichaelC-15022
-
RE: Ever seen this tactic when trying to get rid of bad backlinks?
I don't think there is any tactic happening. They are simply building lots of mini websites for their clients and messed up on nofollowing affiliate links. It appears that they have not done any of the basic SEO audit work on their system. Nothing deliberate here, IMHO.
-
RE: YouTube vs. LimeLight - What are the SEO pros and cons of each platform for on-site video viewing?
Hi Jake,
I'm not familiar with LimeLight, but Phil Nottingham did this great writeup on YouTube vs. hosting on other platforms.
One of the things you need to be concerned about is: will Panda recognize the embedded video as rich content on the page? iFramed solutions might not be... traditionally, Google has NOT treated iframed content as existing on the page (although I've seen a couple of examples with clients' sites where iframed-in content has caused the "wrapping" page to rank for content that exists only in the iframe).
I'm a big fan of embedding using Wistia, using their video SEO embed type. It automatically creates not only your video sitemap, but also embeds schema.org/VideoObject markup on the page, so that Google can absolutely tell that there's a video embedded there, and what it's about as well.
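In case it helps, here's a rough sketch of what schema.org/VideoObject markup looks like as JSON-LD. The URLs and values below are made up, and Wistia's SEO embed generates the equivalent for you automatically:

  <!-- Hypothetical example of VideoObject markup in JSON-LD -->
  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "VideoObject",
    "name": "How to choose running shoes",
    "description": "Our buyer's guide to picking the right running shoe.",
    "thumbnailUrl": "http://www.example.com/video/shoes-thumb.jpg",
    "uploadDate": "2014-10-01",
    "duration": "PT2M30S",
    "embedUrl": "http://www.example.com/embed/shoes-video"
  }
  </script>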
Michael
-
RE: Dealing with non-canonical http vs https?
You're on the right track. Force it all to https, and keep the rel=canonical pointing to https versions.
Check out this thread of questions to Google's John Mueller on this topic:
Make sure you test very thoroughly before launching the https-only version: you'll run into issues with things like images, CSS, and JavaScript referenced via http instead of via relative or protocol-free URLs. Same goes for your internal links: you don't want to throw away a ton of link juice (even if only 5% at a time) because of 301 redirects from http to https that you could have fixed :-).
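As a rough illustration (example.com and the paths are made up), the pattern you want on each page looks something like this:

  <!-- Canonical pointing at the https version of the page -->
  <link rel="canonical" href="https://www.example.com/widgets/">

  <!-- Assets and internal links referenced relative or protocol-free,
       so they work under both http and https with no redirects -->
  <img src="/images/logo.png">
  <script src="//www.example.com/js/app.js"></script>
  <a href="/widgets/blue-widget/">Blue widget</a>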
-
RE: 2 Question about URL structure
I'd be inclined to go shorter. I don't believe you're going to see any additional ranking benefits from having the keyword in the URL twice (might be different if the keyword was in the domain AND the URL, but even then...).
I'd be a little concerned that having the keyword in there twice might look spammy to Google, too.
-
RE: A/B Split Testing - Rankings Drop? Need an expert opinion...
How big is your site? Is the page being A/B tested a pretty strong page (PA, PR)?
Let's say you've got a relatively small site, with a few dozen pages, and the page in question is one of only a couple linked to from the main nav. So it's one of the stronger pages on the site. Blocking it in robots.txt means Google isn't going to continue to distribute link juice from that page to other pages on your site, so all of those other pages get slightly weaker.
In general, robots.txt isn't the place to block Google, as you throw away the outbound link juice from that page that's blocked. Instead, you'd want to do a meta robots noindex,follow on the page itself.
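To make the difference concrete (the path below is hypothetical):

  # robots.txt -- blocks crawling, so the page can't pass link juice onward
  User-agent: *
  Disallow: /landing-page-b/

  <!-- meta robots on the page itself -- keeps it out of the index,
       but its outbound links still get followed and pass link juice -->
  <meta name="robots" content="noindex,follow">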
If you've got a big site, and this particular landing page isn't a major portion of the overall site in terms of PA, then you shouldn't have seen an effect like I've described.
-
RE: Canoncial tag for homepage?
Oh, I think you're asking for trouble if you do that.
Unless your home page is really badly targeted for that keyword, or the internal page has the keyword in the URL and your domain doesn't, or somehow you've built a ton of links to the internal page and NOT the homepage, this is really suspicious, and makes me think that Google might be levying a penalty against your home page.
If that's the case, the canonical tag runs the risk of transferring the penalty.
First off, install Chartelligence in your Chrome browser, then take a look at your Google Analytics to see when the traffic dropped for your home page, and whether that drop lines up with a particular iteration of Panda or Penguin.
-
RE: How can I optimize pages in an index stack
Hello Rod,
Can you explain what you mean by an "index stack"? I haven't seen that term used before.
-
RE: Could a JS script that scrolls automatically within pages make some content "hidden"?
Depending on how you cause the scroll to happen, Google might render the page unscrolled or scrolled. Usually, if it's done in the onload() function via JavaScript, Google will execute that script and render the page as it is after the script has run. I've seen examples, though, where code in jQuery's document-ready function is NOT executed by Google when rendering the page.
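Here's a simplified sketch of the two approaches I'm describing (the scroll offset is arbitrary):

  <script>
    // Scroll triggered in onload -- in my experience Google generally
    // executes this when rendering the page.
    window.onload = function () {
      window.scrollTo(0, 600);
    };
  </script>

  <script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
  <script>
    // The same scroll via jQuery's document-ready -- I've seen cases
    // where Google's renderer did NOT run this.
    $(document).ready(function () {
      window.scrollTo(0, 600);
    });
  </script>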
Test in Google Search Console, using Fetch and Render as Googlebot.
-
RE: Miriam's 7 Local SEO Predictions for 2019
I think most of the community is currently Christmas shopping online...and making decisions based on fake reviews :-p.
-
RE: Over optimization of Anchor Text | Consequences, Guidelines, Precautions
First off, I'd get rid of all of the article marketing links. Disavow those domains right away. If you get a manual penalty, you will not get the penalty removed if there are still examples of article marketing out there.
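If you haven't built a disavow file before, it's just a plain text file you upload in Webmaster Tools -- one domain or URL per line, with # for comments. The domains below are made up, just to show the format:

  # Article marketing directories -- removal requested, no response
  domain:spammy-article-directory.com
  domain:free-articles-site.net
  # A single bad page rather than the whole domain
  http://some-blog.example.com/great-widgets-cheap-widgets/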
While you cannot control how people link to you, you should expect that the vast majority of people who link to you without being coerced will link using your domain name or your brand name.
It's hard to know precisely what % of anchor text can be keyword-targeted, especially as an updated Penguin algo came out just a few days ago, making studies older than that come under question :-). But there's a great post here from last year that you should read, where it would appear that you'd be ok keeping your optimized anchor text links at 25% or less (which I would agree with).
But to see if you ARE suffering from a Penguin penalty for over-optimized anchor text, spend a little time in Google Analytics, drilling down into the organic search area, and look at your top 5 or 10 keywords, and see if traffic from those took a hit around one of the Penguin updates. Chartelligence is a really nice Chrome plugin that can help with this.
-
RE: Link Detox or I can use Open Site Explorer for tracking down bad links?
Recognize that there are three major types of "bad" backlinks that you need to look for:
- sitewides
- over-optimized anchor text
- links from really stinky places
Link Detox is great--I use it all the time to diagnose penalty issues. BUT it only really helps with links from spammy domains--it won't help you spot over-optimized anchor text. I use OSE's anchor text tab to do that.
To spot sitewides, you'll probably want to download the entire backlink profile from OSE into a spreadsheet, then sort by domain, and look for massive blocks of links from the same domain.
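If you'd rather script it than eyeball the spreadsheet, a quick-and-dirty sketch in plain JavaScript (run under Node; the URLs and the threshold are made up, and you'd feed it your real OSE export) looks something like this:

  // Rough sketch: count backlinks per linking domain and flag likely sitewides.
  var url = require('url');

  var backlinkUrls = [
    'http://blog-a.example.com/post-1/',
    'http://blog-a.example.com/post-2/',
    'http://other-site.example.org/resources/'
  ];

  var countsByDomain = {};
  backlinkUrls.forEach(function (u) {
    var host = url.parse(u).hostname;
    countsByDomain[host] = (countsByDomain[host] || 0) + 1;
  });

  Object.keys(countsByDomain).forEach(function (host) {
    if (countsByDomain[host] > 100) {   // "massive block" threshold is arbitrary
      console.log('Possible sitewide: ' + host + ' (' + countsByDomain[host] + ' links)');
    }
  });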
-
RE: Local search ranking tips needed
Hey Alex, in the U.S. at least, a good trick for cementing Google's understanding of your client's locations is to manually add each place in Google MapMaker. EVEN IF IT'S THERE ALREADY. Just go through the process and try to add it; when it asks if it's a duplicate, accept that option. It'll then say it's discarding your changes, but click OK and then Save. Mike Blumenthal told me at lunch at SMX Advanced this year that he recommends doing that every couple of months even if there don't seem to be changes. If I remember correctly, he said Google trusts the MapMaker updates a LOT. Pretty sure David Mihm told me the same thing. And from my own clients' little projects, I know that they're manually reviewing these, so that makes sense.
See if that helps...it'll just take you 5 minutes to do it, and then a week or two before they review your update.
Google's confidence in a business' current physical location seems to be pretty important to their algo (and it makes sense, as they look foolish anytime they direct a consumer to an empty office!).
MC
-
RE: Links being reported in Webmaster Tools
What it really comes down to is this: how deep does each of the link-reporting tools crawl? In my experience (lately, from doing a lot of penalty recovery work for clients), I'm seeing GWT crawl a fair bit deeper than most of the other tools out there, so their number is going to be higher.
There are a lot of pages out there that really aren't content pages: things like reply-to URLs, media-only pages in WordPress, standalone blog comment pages, etc.
When it comes to link juice, these kinds of pages have really almost nothing to give to a page they link to, so for the most part, whether you ignore the bottom 1% or bottom 10% of links to your site, you're not missing anything that matters. The exception to this would be a penalty situation, e.g. where there's a bunch of blog comment spam on really, really weak sites....Google is going to want to see that stuff cleaned up, even though it's not really contributing anything of significance to PageRank.
-
RE: Over optimization of Anchor Text | Consequences, Guidelines, Precautions
Deleting is generally preferable, but often harder. Short-term, you might disavow (rather than delete) so that, if Bing isn't penalizing you for those links, they still get counted there. If you run into a manual penalty, Google's spam team will want to see serious effort AND results from trying to get links taken down.
Really depends on the quality of the site. If it's a spammy site, then disavow/delete. If it's not a terrible site, then try to change the anchor text so you retain the link.
-
RE: Access to a Clients Google Places Account
I agree with Mr. Wignall's and Ms. Ellis' responses--tell them to change the password temporarily for you, then change it back afterwards. I always tell clients that it's partly for MY protection, i.e. if they have someone else at their company who knows their login (or their login is hacked), I want a 0% chance that they think I might be the problem.
Also, you do NOT want to claim it under your own account:
- you would be artificially giving Google information that that account and other clients of yours might be co-owned, or co-managed (so, are any links between them really editorial?), and more importantly
- Google local search has more confidence in the listing being claimed by the real owner if the domain name of the email address claiming it = the domain name of the website (see this thread as well)
-
RE: Should my canonical tags point to the category page or the filter result page?
Hi Daniel,
You're going to have to walk a fine line between having a page for every possible combination of filtered results that a user might search for AND appearing to have a ton of pages that are really almost identical... and suffering the wrath of Panda when it sees what it thinks is duplicate content.
The easy way out is to have 1 page for each category, and no matter what filters are applied, rel=canonical to that category. Dupe content problem solved.
So why isn't this the ideal solution?
#1 You may be missing out on targeting combinations of categories and filters that users will commonly search for. Let's say you were selling clothing, and a category was shirts, and you had a filter for men/women/boys/girls. By making all shirts list pages rel=canonical to the overall shirts list page (with no filters), you'd be missing an opportunity to target "boys shirts".
#2 You may be missing opportunities to pour more link juice to the individual product pages. It's unclear (to me, anyway) whether Google adds the link juice from all pages rel=canonical'ed to a page, or whether Google simply treats rel=canonical as "oh ya, I've already seen & dealt with this page". Certainly in my testing I've seen places where pages rel=canonical'ed to another page actually still show up in the search results, so I'd say rel=canonical isn't as solid as a 301.
So what do you do? I'd recommend a mix. Figure out what combinations you think you can get search traffic from, and find a way to break down the complete set of combinations of filters and categories to target those, and to rel=canonical every page to one of your targeted pages.
It's entirely possible (likely, even) that you'll end up with a mix. For instance, going back to my earlier example, let's say you had another filter for price range. You might want to target "boys shirts", but not "boys shirts under $20". So, while "boys" was a filter value, and "under $20" was a filter value, you might rel=canonical all pages in the "shirts" category with the "boys" filter applied to your page that has just that category and that one filter set, regardless of the setting of the price filter.
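Continuing the hypothetical clothing example, the canonical on a filtered page might look like this:

  <!-- On /shirts/boys/?price=under-20 (hypothetical URLs) -->
  <link rel="canonical" href="http://www.example.com/shirts/boys/">

  <!-- The "boys shirts" page is itself a targeted page, so it's self-canonical -->
  <!-- On /shirts/boys/ -->
  <link rel="canonical" href="http://www.example.com/shirts/boys/">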
Clear as monkey poop?
-
RE: How to add my company in google search search result in bangalore (INDIA)
I'm pretty sure that Moz Local only supports US locations at this time.
-
RE: 1000 of links on my website ? is it good or bad
You've definitely got a lot of links in there in your navigation, but I don't think it's really going to hurt you. It's going to distribute link juice pretty evenly across all of the pages linked to from that left-hand menu and the submenus.
You MIGHT consider reducing the left menu to just the main categories, and NOT including all of the submenus. That would funnel a fair bit more link juice to the main categories, and the submenus I'm guessing are all pretty long-tail anyway, so they shouldn't need as much.
I don't see a duplicate content problem here. I'm seeing different content on every page. You might consider putting noindex/follow on some of the intermediate pages that merely list subcategories, like this one:
http://dorchdanola-netbutik.dk/category/belysning-el-artikler-485/
These kinds of pages will be seen as extremely content-light...and that's not so good.
I wouldn't nofollow your social media links. And rel=me is mostly used to link an author page back to your G+ profile, so I wouldn't use that here.
-
RE: Should my canonical tags point to the category page or the filter result page?
I agree, that's a great approach. I think you mean JavaScript, not Java, though (that's a different language). The only thing that might make this approach a challenge would be if you had so much product data before filtering that it caused a performance problem. Let's say you had 50 pages of results: if you filter server-side, you're only sending down 1 page of results, whereas if you're filtering with client-side JavaScript, you've got to send all 50 pages down and then filter them in the browser.
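Just to make that trade-off concrete, a bare-bones client-side filter (hypothetical data) looks something like this -- note that the full product list has to come down to the browser before you can filter anything:

  <script>
    // All 50 pages' worth of products have to be sent to the browser first.
    var products = [
      { name: 'Blue widget',  price: 15 },
      { name: 'Red widget',   price: 25 },
      { name: 'Green widget', price: 18 }
    ];

    // Filtering happens here, after the full data set has already been downloaded.
    function filterUnderPrice(maxPrice) {
      return products.filter(function (p) {
        return p.price < maxPrice;
      });
    }

    console.log(filterUnderPrice(20));  // e.g. everything under $20
  </script>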
