Best posts made by Adam.Whittles
-
RE: Proportion of branded keywords vs targeted keywords in anchor text distribution
I would certainly look at having more brand-term anchors than commercial ones, so I would start at greater than 50%.
-
RE: Google doesn't rank the best page of our content for keywords. How to fix that?
Hi Julien,
Having looked at the backlink profile for both pages, I can already see one problem. Your 'Annecy' page has very few links pointing to it, whereas your 'Haute-Savoie' page has considerably more. This is not a major problem in itself; the real issue is the anchor text of those links. The majority of links to your 'Haute-Savoie' page use the anchor 'haute savoie annecy', and this could be contributing to your 'Haute-Savoie' page outranking your 'Annecy' page. You can combat this by building more links to your 'Annecy' page.
You could also improve the on-page optimisation of both pages. Again, having looked at your 'Haute-Savoie' page, it is actually quite well optimised for 'Annecy' rather than 'Haute-Savoie'.
Hope this helps,
Adam.
-
RE: Keyword canibalization
Hi Jasper,
This isn't necessarily a bad thing. It really depends on what your intent with the pages is and which page is ranking higher.
If the most relevant page, and the page you want to rank higher, is indeed ranking higher, then you don't really have a problem. Look at it from this point of view: if you can dominate a whole page of Google with your pages (very, very unlikely to happen), it increases the chances of users visiting your site.
It is only really an issue if you are deliberately targeting the same keyword with different pages and want each of those pages to rank for it. If you are targeting different keywords with each page and they are ranking well for those keywords, then it isn't a huge problem that they happen to rank for the same keyword as well.
Of course, if this is the only keyword the two pages rank for, or the page that is ranking higher is not the one you want, then you do have a keyword cannibalisation problem. In this case, you will need to analyse both pages to see what is causing it. For solutions, you should read Rand's article on the topic.
Hope that helps,
Adam.
-
RE: Duplicate page title on blog
As I said below, you do not need unique titles for these pages. When using pagination on a blog, i.e. blog?page=4 and blog?page=5, you cannot avoid having the same page titles. Yes, you should have unique page titles wherever possible, but this is a case where it is not possible.
Again, you should make sure to use rel="next" and rel="prev" which you can find out more information about in the link I provided.
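For illustration, the pagination markup for blog?page=4 would look something like this (example.com is a placeholder):

```html
<!-- In the <head> of blog?page=4 -->
<link rel="prev" href="http://www.example.com/blog?page=3" />
<link rel="next" href="http://www.example.com/blog?page=5" />
```

The first page in the series only gets rel="next", and the last page only gets rel="prev".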
-
RE: Is it possible to override the 10k pages crawl limit on PRO?
You might want to consider upgrading to Pro Elite. The crawl limit is increased to 20,000 pages.
-
RE: Duplicating Keywords in Page Title
Hi Robert,
Firstly, the SEOmoz on-page report is really only a guide and you shouldn't worry too much about following it to the letter. To be honest, I rarely use it and just follow best practices.
That said, there are still several alternatives you could try:
1. You do not necessarily need to have your company name in the title. This practice is quite commonplace nowadays but, realistically, unless you're a strong brand with huge volumes of branded traffic, you don't need the company name in the title.
2. I would suggest targeting the two keywords on different pages, i.e. a Widget Program page and a Widget Software page. This may be the most user-friendly option and probably the best option overall, especially if the two keywords attract large volumes of traffic.
Hope this helps,
Adam.
-
RE: Blog.mysite.com or mysite.com/blog?
Hi Tim,
I generally prefer to go with the subfolder option (mysite.com/blog) rather than the subdomain (blog.mysite.com). The reason I prefer this option is because having the blog in a subfolder means that it will benefit from the value of the root domain. In other words, links that are obtained by the root domain will pass that value to the subfolders. However, a subdomain is treated as a separate site and therefore not much value is passed via the root.
Rand provides an excellent answer in a previous Q&A on a similar topic.
Hope that helps,
Adam.
-
RE: Duplicate Title
Hi Tammy,
You have not solved the problem and this was not the root of the problem in the first place.
You will now need to set up a 301 redirect from the non-www version (http://carolynnescottages.com.au) to the www version (http://www.carolynnescottages.com.au), as well as implement my previous recommendations. On further inspection, after doing a site: search, there are both non-www and www pages indexed. You may also need to set your preferred domain in Google Webmaster Tools.
'Deleting' carolynnescottages.com.au, as you put it (although you haven't actually deleted it as such), will not resolve the duplicate content issues you had previously.
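As a sketch, assuming the site runs on Apache with mod_rewrite available, the non-www to www 301 redirect could look like this in .htaccess (check against your actual server setup):

```apache
RewriteEngine On
# Redirect any request for the bare domain to the www version with a 301
RewriteCond %{HTTP_HOST} ^carolynnescottages\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.carolynnescottages.com.au/$1 [R=301,L]
```

If the site runs on a different server (IIS, nginx, etc.), the equivalent rule would go in that server's configuration instead.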
-
RE: "/blogroll" causing 404 error
Hi Andrea,
If the crawl is returning 404 errors then, although you have removed the widget, the pages are still being linked to from somewhere on your site.
My advice would be to use the Screaming Frog crawler, or another crawler if you have access to one. Once you have crawled the site, you should be able to find out which pages are still linking to the 404 pages. Once you have found these, you will have a better idea of how to fix the issue.
Remember, a crawler will crawl your entire site, including all links, and if 404s are found then these are being linked to internally.
Hope that helps,
Adam.
-
RE: Keyword Ranking
Your question has two parts to it really.
Is there an easy way to reduce this list without having to go in word by word and pause or delete them?
Unfortunately, if you have started a campaign and want to keep the historic ranking data for certain keywords whilst removing others, then you have to manually remove the keywords you no longer want. Otherwise you will lose the historic data.
Like is there a bulk upload type of system?
There isn't a bulk upload option, but there is a way you can simplify the task. As Donnie says, you can upload a comma-separated list of keywords. If you are downloading a list of keywords from Google to upload, or you have a list of keywords on separate lines, just put the words into a text editor and use regex to replace line breaks with a comma (replace \n with ,). Then you can simply copy and paste into SEOmoz.
If you do not need the historic data then you can just follow this method.
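The line-break replacement above can be sketched in Python as well (the keyword list here is just an illustration):

```python
import re

# Hypothetical keyword list, one keyword per line,
# e.g. as pasted from a Google keyword export
keywords = "blue widgets\nwidget software\nwidget program"

# Replace each line break with a comma, matching the
# find-and-replace step described above (\n -> ,)
comma_separated = re.sub(r"\n", ",", keywords)
print(comma_separated)  # blue widgets,widget software,widget program
```

Any text editor with regex find-and-replace (Notepad++, Sublime Text, etc.) does the same job without code.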
Hope that helps,
Adam.
-
RE: Is reported duplication on the pages or their canonical pages?
Hi,
The URLs reported by the crawl as duplicates are the duplicate pages themselves. Unfortunately, the way the SEOmoz crawl works, it does not factor in the rel=canonical tag when reporting duplicates. In other words, even with the tag implemented, it will still report these pages as duplicates. Don't worry though: as long as the tag is implemented, the search engines should treat the canonical like a 301 redirect and not penalise you for duplicate content.
So to answer your question:
Are the pages with the query strings the duplicates? - Yes.
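For reference, the canonical tag on a query-string URL looks something like this (the URLs are placeholders):

```html
<!-- In the <head> of the duplicate page, e.g. http://www.example.com/page?sessionid=123 -->
<link rel="canonical" href="http://www.example.com/page" />
```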
Hope that helps,
Adam
-
RE: "/blogroll" causing 404 error
Sorry Ben but I have to disagree with you here. That is very bad practice and also very poor advice. You shouldn't just ignore 404 pages from a site crawl.
Really the only time you should let pages just 404 is when Google has indexed them, there is no relevant page on your site to redirect them to, there are no high value links pointing to them and they are not being linked to from within your site.
However, in this case the 404 pages are being linked to from within the site. This means that value is being passed to these pages from within the site that could otherwise be passed to other pages.
Best practice in this situation is to fix the links that point to the 404 pages and 301 redirect the 404 pages to relevant pages on the site.
P.S. Running a quick site crawl and fixing the 404s should only take minutes, not hours!

-
RE: SEOmoz ranking report SERP
It is important to note that even though you are using the gl=us parameter in your search string, it does not guarantee that you will see the same search results from that location if you are in another location. This is because you will be accessing data from a different Google data center.
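For reference, a search string using the gl parameter looks something like this (the query itself is illustrative):

```
http://www.google.com/search?q=widgets&gl=us
```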
-
RE: A google ranking problem
There could be a number of factors affecting your rank in Google.com compared to your local Google search engine. Ultimately, Google will return results it believes are most relevant to the particular country. If there are strong local indicators on your website (local hosting, local domain extension such as .co.uk, localized content, links from local domains, setting a geographic target in webmaster tools) and very few international indicators, then you cannot expect to rank well for both Google.com and your local Google.
On a side note, having keywords in URLs is not black hat and is even encouraged to help pages rank. I think you may be confusing keywords in a URL, e.g. http://www.company.com/keyword, with exact match domains, e.g. http://www.keyword.com. However, the use of either is not considered black hat SEO. It is worth noting that Google is now giving exact match domains less weight in the SERPs in favour of brand sites.
Adam.
-
RE: New EMD update effected my mom's legit author page? From page 1 in SERP to nowhere for her name
I did a little digging around your mother's site (margaretterry.com) and I don't think you have been hit by the recent updates.
Correct me if I'm wrong but it looks like the domain was created in February of this year and I assume the site didn't go live till even later?
Having looked at the site with opensiteexplorer.org, there are no links being picked up and the page (and domain) authority is 1/100. Basically your site has no authority behind it.
When you initially launch a new domain, it is not unusual to see it bounce around. I also believe that Google tends to give new domains an initial artificial boost to help them get started. However, after a period, the site will be expected to rank on its own merits. This means that if your site has no links, you will not be able to rank well.
My advice would be to start building links and authority to the site. I'm afraid it's not a quick fix but it looks like your only option.
-
RE: SEOmoz ranking report SERP
No, it does not get the SERPs from that location, mainly because you are not based in that location and therefore Google will still pull the data from your closest data center. According to Google,
"the gl parameter boosts search results whose country of origin matches the parameter value"
Therefore, you won't get exactly the same results as if you searched from the location itself.
-
RE: Should I include language specific characters in the URL?
I guess the easiest thing to do would be to test it yourself.
For example, when I search [über] in Google.de (with German location settings), after a Google result in the number one position, the next two results have 'ueber' in their URLs, not 'über'.
Hope that helps,
Adam.
-
RE: Google Places and Pre-Selected Categories
You could try the Google Places Category Tool.
Adam.
-
RE: Switching from a .org to .io (301 domain redirect)
In that case, I would probably just stay with the .org. Is being part of the Apple affiliation program essential to your business? If so, then I would consider trying to find a different domain.
You will probably struggle to rank in the main English-language territories with a .io domain. I can't recall ever seeing one in the SERPs, and I hadn't even heard of it before this question was asked!
Don't be misled into thinking that 301 redirecting your old domain to the new one will instantly get the new domain to the same rankings as the old one. It doesn't quite work like that, and will depend on many factors.