Official SEOMoz complaint: I never did receive my hug from Roger. Just saying...
Update: Hug received via Twitter!
Hi Mickel -
The answer is both. Keep your HTML sitemap and the link in the footer as is. Crawlers will look at it, but it is generally more for human visitors.
Then create your XML sitemap (www.url.com/sitemap.xml) and submit it in Google Webmaster Tools.
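A minimal sitemap.xml looks like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.url.com/</loc>
  </url>
  <url>
    <loc>http://www.url.com/about/</loc>
  </url>
</urlset>
```

Once it's live at the root, you can submit that URL directly in Webmaster Tools.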
Hope this helps!
It looks like these guys do a lot more than just concrete repair. For that reason alone, I wouldn't try to fully optimize the home page for concrete-related keywords only.
Since the site itself has very low metrics to begin with, you basically have a clean slate. I'd optimize all of the service landing pages for their respective keywords (e.g. "concrete repair," "concrete repairs," and "concrete repair contractors" for the concrete page, though I would not triplicate any one keyword), making sure they have good titles, a good amount of unique copy, and so on.
It's a natural approach to the taxonomy, which always works well for Google placement. It won't get you to rank on on-page factors alone (except maybe for some of the more obscure services), so some inbound links will probably be needed.
Good Luck!
Hi Ilya, I replied to another recent question, so I'm pretty sure I know your dilemma.
From what we are seeing, aim for thresholds where your keyword anchor is less than 20%, and then use brand and URL anchors, as well as generics like "click here", to fill in the rest. Brand is good, but I still wouldn't exceed a threshold of 30% on any one anchor. So for example, for a company called Sparkling Flooring, we would use "Sparkling Flooring" no more than 30% of the time, www.sparklingflooring.com around 30% (you can go higher with URLs, we just choose not to), and 'hardwood flooring' no more than 20%, etc.
Combined with excellent unique copy, a good information section like a blog or installation-tips section, and other good optimization techniques, you're golden.
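To make the math concrete, here's a quick sketch of that threshold check. The anchors and counts below are hypothetical, standing in for an export from your backlink tool:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink export (e.g. an OSE CSV).
anchors = (
    ["Sparkling Flooring"] * 30           # brand anchors
    + ["www.sparklingflooring.com"] * 30  # bare-URL anchors
    + ["hardwood flooring"] * 20          # money keyword
    + ["click here"] * 20                 # generic filler
)

counts = Counter(anchors)
total = len(anchors)

# Each anchor's share of the overall link profile, as a percentage.
shares = {anchor: 100 * n / total for anchor, n in counts.items()}

# Flag anything over the rule-of-thumb 30% ceiling discussed above.
flagged = [a for a, pct in shares.items() if pct > 30]

for anchor, pct in shares.items():
    print(f"{anchor}: {pct:.0f}%")
print("Over threshold:", flagged or "none")
```

A profile like the one above passes; tip the same 100 links toward 40 copies of the money keyword and it would get flagged.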
James, can you elaborate a bit on your answer?
Hi Joshua,
There are a number of ways to stop Google from counting your dynamic URLs as duplicates. It's unclear from your question why you can't use canonical tags for this. If you went here:
http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html
And add the canonical tag in the HEAD section:
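For that page it would be a self-referencing link element:

```html
<link rel="canonical" href="http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html" />
```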
It will solve your issue of duplication when people choose property variables like waterfront or bedroom count. I think you were trying to point out a reason this won't work at the end of your question, but I'm not exactly sure what you are alluding to there.
You do lose some juice on 301s, so obviously the best course is to contact the sites that house the links and ask them to change them to the new URL (once the new URL is live; you would still keep the 301 redirect). This isn't always easy to do, particularly when there are thousands of backlinks, so it really depends on how feasible that option is and how many backlinks you have (are there only 3? Then having an optimized URL will probably be more beneficial than leaving the URL unoptimized, even if you can't have the links changed).
As to 'exactly' how much you lose? I don't think anyone has a definitive answer. But I have worked with websites that 301 redirect almost every page when they migrate to a new platform, and the SEO impact is not severe if done properly.
I still recommend mining your backlinks and having their targets changed (at least for the more authoritative ones).
Hi Dana - Let me see if I understand this correctly:
In question 1 you asked if this would be a duplicate content issue. The canonical tag retains the exact same URL regardless of the search parameter (and the resulting search results). Therefore, regardless of the search being made, Google and other crawlers will not index pages with a search parameter, since the canonical references the original URL (http://www.ccisolutions.com/StoreFront/category/search-return). This means that when Google accidentally lands on http://www.ccisolutions.com/StoreFront/category/search-return?q=countryman it sees the canonical tag and understands that it should not index that page, as it is only a variation of the core page.
This would of course be a problem if you actually wanted Google to index every query page. Alternate methods would be to exclude the query parameter in WMT or robots.txt. But the canonical is built in for you, so that you don't have to.
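If you did want to handle it yourself, a robots.txt rule along these lines would keep crawlers out of the query URLs (a sketch, assuming `q` is the only search parameter in play):

```
User-agent: *
# Block any URL containing the ?q= search parameter
Disallow: /*?q=
```

The wildcard syntax is supported by Google and Bing, though not by every crawler.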
In situations like this I also like to add site search to analytics and block the query parameter so no query pages show up as landing pages.
To reiterate Dana's point: we definitely need more info about the specifics, because yes, there could be disastrous effects if this is not done properly.
Besides the meta refresh, there are other things to consider. Probably the most important is the positioning of the home page in search results for the 'product'. If you are ranking in the top 5 for your product keywords, and the ranking page is your home page, you may find that when you rework your home page to reflect your brand, you lose that positioning.
If you were moving from one landing page to another (i.e. /products to, say, /products/buy), it would be a bit different, because you could 301 the /products page and officially tell Google and the other SEs that you are simply moving the page. You obviously cannot do this with the root domain, or the 301 would defeat the branding purpose.
I would definitely check positioning and revenue for your money keywords first. Also, once you move the product, I would at the very least have a link in the main navigation of the home page that links directly to the product with the appropriate anchor. If you only have one product, or one product set, I would also encourage you to optimize the URL (i.e. instead of /product it could be /
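For the page-to-page case, the 301 itself is a one-liner in .htaccess (a sketch, assuming Apache and the hypothetical /products to /products/buy move):

```apache
# Permanently move the old landing page to its new path
Redirect 301 /products /products/buy
```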
Still need more info to make solid recommendations.
You can always check by testing in your browser, but the best way is to check the header response to make sure the server is sending the proper response (a 301) - your landing pages look good (see below). I use Live HTTP Headers, which is a Firefox plugin; here's what it tells you:
http://pharmacy777.com.au/our-pharmacies/applecross-village/
GET /our-pharmacies/applecross-village/ HTTP/1.1
Host: pharmacy777.com.au
User-Agent: Mozilla/5.0 (Windows NT 6.0; rv:15.0) Gecko/20100101 Firefox/15.0.1
HTTP/1.1 301 Moved Permanently
Date: Thu, 04 Oct 2012 03:23:17 GMT
Server: Apache/2.2.22 (Ubuntu)
Location: http://www.pharmacy777.com.au/our-pharmacies/applecross-village/
So the redirect is working. The only thing I noticed was that the home page instantly switched to www without even returning a 301, so it appears you may have implemented a redirect there outside of .htaccess.
If your report is still showing duplicates, make sure that it's not the trailing slash. Your URLs can be loaded as either:
http://www.pharmacy777.com.au/our-pharmacies/applecross/
http://www.pharmacy777.com.au/our-pharmacies/applecross
The best way to find out if the SEOMoz report is counting these as dupes is to export the crawl report to CSV (top right of the crawl report). Then go all the way to the far-right column called 'duplicate pages' and sort it alphabetically. This column will show you all of the duplicate URLs for each particular URL row. Lots of times you can find little 'surprises' here - that CSV report is priceless!
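If the trailing slash does turn out to be the culprit, one common fix is to force it at the server level (a sketch, assuming Apache with mod_rewrite; a canonical tag on each page works too):

```apache
RewriteEngine On
# Leave real files (images, CSS, etc.) alone
RewriteCond %{REQUEST_FILENAME} !-f
# 301 any URL that doesn't already end in a slash to the slashed version
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```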
I agree with Dana - the only thing you would need to worry about is if the link were a direct text link using a keyword anchor that is used abundantly throughout your link profile. Even then, if it's an authority site that is relevant, this is not something to worry about at all. In fact, it's probably a great link (regardless of the sitewide instances).
Don't look for a question mark; go to the site's profile and look for an email - it will tell you if Google has found unnatural links. Keep in mind that sometimes these emails come 4 or 5 days after the penalty has been applied, so you may have to wait a few days (unless this happened some time ago).
If it's Penguin, you won't get a notification. Others may disagree, but the Penguin recoveries I've performed were almost exclusively due to a higher-than-"natural" proportion of keyword anchor text links in the link profile. If you check your backlinks in OSE and find that you have a high percentage of links that all use your money keyword, then you will need to have those backlinks diversified.
You're opening a can of worms with this question!
The efficacy of meta tags is much debated. Most people believe that the keyword meta tag has no effect whatsoever on SEO, and some believe the same to be true for the meta description.
The original purpose of the meta keywords tag was to help Search Engines understand what your page was about. After years of unabashed over-optimization, the tag slowly became less and less of a signal.
The meta description tag is a brief description of the page, and is sometimes used as the description in the SERPs. There are varying arguments on the efficacy of this tag as well, although it can be useful from a clickthrough conversion standpoint.
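For reference, the tag itself looks like this (the content value is placeholder text):

```html
<meta name="description" content="A short, compelling summary of the page." />
```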
I'm sure you'll get a lot of varying opinions on this one!
The link you referenced has 'www' in it; is that how the link is targeted on your website? If so, it's probably the double redirect that is causing the issue. Since WP is set to 'non-www', every time there is a call for the www version of a URL, WP automatically 301 redirects it to the non-www version. There is nothing wrong with this.
It's when there is a call for the 'www' version of a URL that has also been redirected, as the one you cited has, that a double redirect takes place:
http://www.luxuryhome..../sunnyisles.html
to the 'non-www' version:
http://luxuryhome.../sunnyisles.html
then from there to the new html file version:
http://luxuryhome.../sunny-isles.html
The header check shows a normal www-to-non-www redirect first (WP is doing this), and then the 301 redirect that changes sunnyisles to sunny-isles. Both server responses seem OK, so the redirects themselves seem to be working. What you want to make sure of is:
1. Any internal links pointing to the old sunnyisles.html page do not contain 'www' (and in any event, these links should be changed to point to the new page anyway).
2. Any inbound links from external sources do not reference the 'www' version.
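If the rules themselves turn out to be stacking, the chain can be collapsed by sending the old file straight to its final URL before the www rule runs. This is a sketch only, with example.com standing in for the real domain:

```apache
RewriteEngine On
# Old file name goes straight to the final URL, regardless of host
RewriteRule ^sunnyisles\.html$ http://example.com/sunny-isles.html [R=301,L]
# Everything else: normalize www to non-www
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```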
It would be helpful if we could see the .htaccess file as well.
Marie is exactly correct - the unnatural links penalty and Penguin are two separate entities.
The best way to combat a Penguin-related traffic drop is to use your SEOMoz tools to mine your backlinks and then sort by anchor. Traditionally, link builders would buy a majority of their links with the same anchor text (or close variations). Websites that have a majority of their links on one anchor can get hit by Penguin (there are some instances where this isn't the case, where brand and exact-match-anchor domains are concerned, but generally it holds).
If your links were natural, Google would see links of all types in your profile, including "click here" and "www.url.com" etc. You should try to decrease your keyword anchor threshold to below 30%, either by getting rid of spammy keyword-anchor links or by building more generic-anchor ones.
This is a tough one. The worst thing to do is to start with citations, find out they aren't working, and then fix the issue and start all over again (now having two NAPs floating around).
Some businesses use virtual offices in an attempt to rank in different cities or areas. If this is the case with your client, it never hurts to contact that service and explain that you must have a unique suite number. I've found that in some cases they will be quite accommodating.
As for the effects - I've performed local optimization for clients in this scenario and it still worked fine (and the other businesses using the same virtual office were also in the maps for their keywords), but with constant changes in Local, it's risky (in my opinion) to continue without getting a unique address first.
Just my 2 cents!
View the page source (right click + View Page Source) and look in the HEAD section. You'll see it:
<meta name="keywords" content="keyword1, keyword2" />
Tip: if there's a lot of stuff crammed into the HEAD section, just CTRL-F and search for meta name="keywords".
Hi Dana -
I think in the case of Google Custom Search, there is no need to worry about duplication. The reason is that although the rel="prev" etc tags are not being used, a blanket solution already exists: the canonical tag. As you mentioned, the canonical tag never changes, regardless of the search - therefore the crawlers only ever see the Custom Search page as a single page regardless of the queries being made. Thus there is no duplicate issue.
Pay close attention to what Ben mentions about Local SEO. If you poke around long enough with your keywords you'll notice obvious trends in your area (that may not equate to other regions) particularly with the words "in" or "near".
You'll start to notice how certain prepositions trigger the map packs. It's just something to pay attention to depending on your keyword + locale.
Hi Tung - welcome.
A lot of times when you search for a keyword that exactly matches your domain, Google will show sitelinks as long as the keyword isn't generic, even if the site is relatively new. Looks like you just got cached today.