Hi Amelia
To follow on from Chris, can I ask if you're asking about getting the stars in your paid ads or in the organic listings?
If it's the latter, you'll need to have schema markup on the page itself, here's a Google guide to it.
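As a rough illustration of what that markup looks like: review stars in organic listings come from structured data such as schema.org's AggregateRating type. Here's a minimal Python sketch that builds the JSON-LD you'd embed in a script tag of type "application/ld+json" on the product page - the product name and figures are made-up placeholders, though "Product", "AggregateRating", "ratingValue" and "reviewCount" are genuine schema.org properties:

```python
import json

# Placeholder data -- swap in your real product name and review figures.
snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",   # the average score shown as stars
        "reviewCount": "128",   # how many reviews are behind it
    },
}

# Embed this JSON inside <script type="application/ld+json"> in the page <head>.
print(json.dumps(snippet, indent=2))
```

Once it's live, you can check whether Google can read it with their structured data testing tools.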
Hi there
Both Articlebase and Ezines have nofollow links - meaning no equity will pass. In fact, most article submission sites do the same, and you should really move away from using them as an SEO tactic. This also means that OpenSiteExplorer is accurate.
Hope this helps.
Hi Gary
I think what BrightLocal is looking to do here is earn you more business citations, rather than directory listings. It's funny, because the difference is only a subtle one: rather than getting you on websites that are "traditional" directories, they instead look to get you listed on places like Yelp, Yell, Foursquare, Bing, Yahoo, Qype and so on.
Those kinds of citations definitely do help. It does look as though BrightLocal offers directory listings on moderated, niche-specific and local directories, which is less of a risk (but valuable? I'm not so sure). In the package you can also hand-pick which ones you want - so you could opt out of them all if you chose to, which might not be a bad idea.
It looks like it automates/outsources what can be a pretty laborious process, so it might be worth looking into if the price is right. BrightLocal is a pretty reputable brand, so I'd say the quality would be there.
Of course, you could build those citations yourself, and a good place to start is right here on Moz. Moz has the best ones by city and best by category, while BrightLocal has an international list, as does WhiteSpark. You can probably find more by searching for "best citations for [country]".
Hope this helps.
Hi Tom
It's quite oxymoronic, isn't it? "Natural" link building is in itself a strange term: Google advocates that you don't proactively build links, yet its algorithm still relies so heavily on them.
In the purest sense, if you ever proactively look to build or earn links to your site from another, Google is unhappy about it. The reality, though, is that that's completely unrealistic (never mind hard to enforce and hypocritical on their part).
When people talk about "natural" link building, they mean the kind of link building or earning that leaves you with the lowest possible risk from Google. Many people have their own definitions on the topic, but here's mine:
A "natural" link is one that is: editorially earned; contextual; and uses natural, non-keyword anchor text.
To explain these further: by "editorially earned", I mean that you should provide something - be it an article, resource, video or whatever - that warrants being placed and shared on someone else's website. If your content really jumps out and makes people go "Oh wow, look at this, I want to share it" or "Hey, that's really relevant to my site and my users", then any link you get as credit for that resource is "editorially" earned. What you'll often find is that the sites you get links from are relevant and contextual to your own site - eg if you ran a fishing equipment website, you may get links from local angler clubs. You really want to warrant your link being there by producing something of excellent value.
By "contextual", I mean that your link should appear within the body of the article or written content of the webpage. If you've "editorially earned" your content, this should usually happen - however what has happened in the past, even on Moz, is that authors or publishers would advocate the use of an author biography that contains a link to their site. In my mind, it is far better to have a link surrounded by the content/video/article/etc itself - as in put the link there while you're talking about it - as I feel Google are blanket devaluing author bio boxes (rightly or wrongly depending how you see it). Rather than leave it to chance, I'd rather get the link in the content of the page itself.
And finally, how the site links to you seems to be a big factor in "natural" link building. I think Google (again, rightly or wrongly) always look for an excuse to hold it against you if they find you have links going to your site that contain anchor text that you might want to rank for. It's quite ridiculous, I know, but in order to keep things "natural", I'd say you should only have links that contain your brand, just your URL, or any miscellaneous anchor text (eg "check out what these guys did at the weekend").
So when we talk about "natural" link building, that's what I think people mean. Now, is this the way you want to do your SEO? Is it the best way? Is it the most suitable, and will it give the greatest return on investment? That's all up for debate. I wouldn't say I'm either for or against this method - it really does differ depending on your resources and industry. There are plenty of people out there doing "unnatural" link building and getting results and ROI. One thing I would say, though, is that it's hard to build and keep a brand presence if your site is being penalised, so natural link building really comes to the fore there.
I hope this helps - as you can see the question makes for a good debate and I hope my point of view helps explain a few things.
Hi Thomas
It really doesn't matter any more. If anything, there might be more reason to use a hyphen rather than a pipe (|).
Starting about a year ago, Google seemed to be rewriting the title tags of pages that looked like: "Keyword 1 | Keyword 2 | Brand.com" into "Brand.com: Keyword 1 | Keyword 2" - Here's an article on that. I've seen it happen with hyphens too but far less often.
But in any case, use whichever option you prefer. I like the hyphen too, but there's nothing right or wrong with either - just be sure to design your title tag for the user and for clicks rather than for search engines (out with the example I used earlier, in with "Get Keyword 1 with free USA delivery - Brand" style title tags), and make sure it will show up in full in search results and not be shortened. You can use Moz's title tag tool for that.
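If you want a quick programmatic sanity check on truncation, here's a rough Python sketch. Bear in mind that Google actually truncates titles by pixel width (roughly 600px), so a character count is only a proxy - the 60-character limit below is my conservative assumption, not an official figure:

```python
# Google truncates SERP titles by pixel width (~600px); ~60 characters
# is a conservative character-count approximation of that limit.
MAX_TITLE_CHARS = 60

def title_fits(title: str, limit: int = MAX_TITLE_CHARS) -> bool:
    """Return True if the title is unlikely to be truncated in the SERP."""
    return len(title) <= limit

# A user-focused title comfortably fits...
assert title_fits("Get Keyword 1 with free USA delivery - Brand")
# ...while a long keyword-stuffed one would be cut off.
assert not title_fits("Keyword 1 | Keyword 2 | Keyword 3 | Keyword 4 | Brand.com Online Store")
```

Moz's title tag tool does the proper pixel-width check, so treat this as a first-pass filter for bulk audits only.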
Hope this helps.
Hi Marcel
Actually, I think the warning signs for this started back in November 2013, when Matt Cutts was asked about meta descriptions and if we should use templates or make them all unique. See this article and this one.
To summarise those articles: as you suspected, he recommended that you should not use a templated meta description. He recommended unique meta descriptions for all the pages you want to rank, but for other pages it's totally fine to use no meta description at all - Google can generate a decent description for you if you leave the tag out of your HTML.
However, if you want a page to have your own description, I'm afraid you'll need a unique one for each. I'd prioritise the products that are your big sellers or those with the best margin and write them first. I would also remove the template ASAP, so that you have no description by default, and then add them in when you have the time. That's the only way I can see to get unique descriptions back, I'm afraid. Try to work out a priority system with your team.
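To find which pages are still sharing a templated description, one rough approach - a sketch, assuming you can export each URL with its meta description from a crawl - is to group pages by description and flag any description used on more than one page:

```python
from collections import defaultdict

def find_duplicate_descriptions(pages):
    """pages: iterable of (url, meta_description) tuples.
    Returns {description: [urls]} for any description shared by 2+ pages."""
    by_description = defaultdict(list)
    for url, description in pages:
        # Normalise whitespace/case so trivial variations still group together.
        by_description[description.strip().lower()].append(url)
    return {d: urls for d, urls in by_description.items() if len(urls) > 1}

# Made-up example crawl data:
pages = [
    ("/product-a", "Buy widgets online at great prices."),
    ("/product-b", "Buy widgets online at great prices."),
    ("/product-c", "Hand-finished oak widget, free UK delivery."),
]
dupes = find_duplicate_descriptions(pages)
```

Sorting the flagged URLs by revenue or margin would then give you your rewriting priority list.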
Hope this helps.
Hi Dan
I wouldn't want to leave anything to doubt and would prefer to have 1 version of each URL available.
Fortunately, a fairly simple solution can be put in place with mod_rewrite. One caveat: the RewriteMap directive below can only be defined in the main server or virtual host configuration, not in the .htaccess file itself - once it's defined there, the RewriteCond/RewriteRule pair can be used in .htaccess. As always, please back up and test before trying any implementation - I can't tell you how many times I've made a simple mistake in an htaccess file that caused big problems!
Anyway, the code you'd want to enter is:
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
That code will basically rewrite any URL containing uppercase letters to the same URL using only lowercase.
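To sanity-check the intent of those rules before touching the server, here's the same logic sketched in Python - purely illustrative, not Apache:

```python
import re

def lowercase_redirect(path):
    """Mimic the rewrite rules above: if the requested path contains any
    uppercase letter, 301-redirect to the all-lowercase version of the
    same path; otherwise serve the page as normal (no redirect)."""
    if re.search(r"[A-Z]", path):          # same test as RewriteCond [A-Z]
        return (301, path.lower())         # same effect as ${lc:$1} + R=301
    return None

# e.g. /Products/Cane-Beds gets redirected; /blog is left alone.
```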
Redirects are quicker and more reliable than canonical tags in my experience and this doesn't take long to get implemented, so best not leave anything to chance.
Hope this helps.
I'm afraid your blog pages are in fact duplicate content, in Google's eyes anyway.
The /us/blog, /uk/blog and /ca/blog examples are all separate URLs that you are asking Google to index (separate canonical tags for each and no robots instructions that I can see). Google is going to look at these and any blog posts within them as separate pages. Once it realises they all have the same content, it will likely result in a Panda algorithmic penalty.
The risk here is that this penalty might affect your entire domain, rather than just the offending pages. I really don't see that as a risk worth taking. Therefore, I strongly advise removing the separate versions of the blogs and consolidating into one blog, with redirects from the local blogs to the new one. Failing that, choose one version and instruct Google not to index the other versions of the page by using a meta robots tag in your header, or in the robots.txt file.
I also advise that you noindex the category page to be sure that its content isn't being seen as duplicate either. More info on how to do that can be found in the Moz Robots Guide.
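For reference, this is the tag in question - it goes in the head section of each page you want kept out of the index (a generic example; Moz's guide covers the variations):

```html
<!-- Keeps the page out of Google's index, but still lets
     the crawler follow the links on it -->
<meta name="robots" content="noindex, follow">
```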
If it's any consolation, given that you're in the insurance niche, it is one that Google wants to keep a close eye on and keep clean - so chances are action would be quicker than if you were in the dog accessories industry, let's say!
We have to remember that, for all of Google's progress, it is still an algorithm that relies heavily on links and its internal PageRank.
If your site has links from places with a strong PageRank, that is almost "reason enough" for the algorithm to rank your site (in the very most basic terms).
Of course, the greatest advancement Google has made in recent times is teaching its algorithms (and manual quality assessors) to detect manipulative link building. They're getting faster and better at detecting them. Rest assured, that site will inevitably be penalised.
In terms of what you can do, look at it objectively: would you want to beat them at their own game and do this, with no guarantee that it would work for you and an almost iron guarantee that it would eventually get you penalised? It doesn't make sense to me - so I would keep doing what you're doing and look to earn more links in the way that you have. Copying them might improve things now, but down the line your site is going to be in an altogether better place if you don't.
Keep at it mate - it's frustrating I know, but it'll turn around.
Ah, I see what you mean Alan and I'm inclined to agree. With the JS you mentioned, there's no risk of a user (or crawler) being taken away from the page itself, so no link would be passed and "diluted", as it were. Thanks for posting this!
Again, it's one of those where we are reliant on Google to provide us with the dates of updates, and they have suggested they will stop doing that altogether. Similarly, a Penguin refresh is different from an update, and refreshes arguably occur more often.
Only thing I would suggest here is to see whether your drop coincides with any SERP flux in Mozcast or Algoroo. If it does, do a Google search for "[date] google update" and see if anyone else reported any changes on that date. That might be able to tell you more.
Hi there
I see what you're getting at here, but there's no great cause for concern.
So, all of those hash links (#) are internal anchors. Link juice (specifically PageRank) does not pass and is not lost with an internal anchor. It only passes through full links - like the text links you have to other pages on the homepage. It won't try to pass (and then fail) on any internal anchors, so link juice won't be "killed" by them, so to speak.
One thing you'll need to keep in mind is how people link to you externally. To ensure all of the "link juice" from an external site is passed to yours, make sure that they link to you without any internal anchors (basically, your URL without any of the #'s).
However, even then, my recent tests have shown that the majority of the strength (if not all) will pass anyway. Seems to me that Google at least treats these internal anchors very well and will recognise where to pass the "link juice" even if one is present in a URL.
So, from a strictly SEO and internal structure point of view, I don't think there's too much to be concerned about here. If it works for you user experience wise etc. I'd say keep it for sure.
Oh wow, that's a lot of spam. Very sorry to hear your client was a victim of this.
If the site was hit by an algorithmic penalty (which could be very likely given the amount of low-quality spam links the hackers built), that might account for the sudden drop. That would also explain why your disavowing has had no effect thus far. It can take much longer for the algorithm (the Penguin portion in this case) to update and take into account any disavowed or removed links on that domain, which is why a penalty might still remain.
It's pretty much impossible to predict when that might happen - could be tomorrow or another 6 months. This is probably the biggest frustration many businesses have with Google - the sheer ambiguity on algorithmic penalties.
With all of that in mind, as sad as it is I think you might be better off starting on a fresh domain and taking all necessary precautions to prevent hacking. I don't like leaving it to chance on when the Penguin algorithm could update, leaving you with an indefinite period of waiting to see rankings come back. Similarly, we can't even be sure you have an algorithmic penalty, due to Google's vagueness on that matter as well (although I suspect at least one is at play here).
The way I see it, starting on a fresh domain gives you that control back, although I realise that may not be an option.
Hi there Chris
If I'm understanding you correctly, I also agree with your conclusion. If you look at JS loaded content like on the bottom of this page (click the "read more" at the very bottom), that is all content that Google can see and parse.
You can also put this to the test yourself. If you go to SEO Browser, insert your URL and press "simple", it will show you how your page looks to Googlebot. If you can see the content you talked about in the results page, you can be sure that Google sees it too. Definitely one of my favourite tools!
Hope this helps!
Hey Guillaume
That should be possible provided that you can add your Coremetrics tags as UTM values (which I'm sure you can).
Here's what Google says you need to do:
You need to use auto-tagging for non-Analytics purposes
If you need to (1) turn on auto-tagging in your AdWords account for purposes other than Analytics tracking and (2) wish to use manual tagging for Analytics, then you must enable the following setting to avoid data discrepancies:
Source: https://support.google.com/analytics/answer/1033981?hl=en-GB
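As a rough illustration of the manual-tagging side, here's a Python sketch that appends UTM parameters to a landing page URL. utm_source, utm_medium and utm_campaign are the standard Google Analytics parameters; the values below are placeholders - you'd mirror whatever your Coremetrics setup expects:

```python
from urllib.parse import urlencode

def add_utm(url, source, medium, campaign):
    """Append standard UTM query parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Use "&" if the URL already has a query string, "?" otherwise.
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

# Placeholder example values:
tagged = add_utm("https://www.example.com/landing", "google", "cpc", "spring_sale")
```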
Hope that helps.
Hi there
You need a closing slash at the end of your canonical tag - it isn't self-closed at the moment. Basically, you need a "/" before the ">", so it should read something like this (with your own URL in the href):
<link rel="canonical" href="https://www.example.com/your-page/" />
A free alternative (with upgradable options) is ProRankTracker - which will give you all of the same info as with SEO Book. I've found it to be very accurate. AdvancedWebRanking is another, but it does not have a free option.
I think you'd be happy with either of them (I am!)
For some terms it will change the search volume. For example, a generic term like "casino" will have (or should have) a greater search volume for a "global" search than for an "English"-only search, because more than one language uses the word "casino".
For other terms like "best IT company in London", it's not going to change much, if at all.
What else do you offer?
Do you do free shipping? Running any discounts?
I'd be much more inclined to click a title tag that says: "Buy Cane Beds - Free UK Shipping | Boogie Beds". Looks like it fits in the Moz Title Tool too.
You definitely want to get your primary keyword in there - it's good for SEO and for users. But after that, make the title as compelling as possible so that it drives people to click your listing, even if you're at #3 instead of #1. Your title tag and meta description are your shop window - make sure you stand out.