Best posts made by AgentsofValue
-
RE: Optimize short tail keyword by optimizing long tail keyword
Yes - I can say from experience that this is true. You'll also feel much more inspired to keep working towards 'coat stand' after you've achieved the mini-success of already ranking for 'cheap coat stand'!
-
RE: 301 duplicate content dynamic url
It's really going to be a bit complex to get this done right. But based on your example above, it looks like you just want to redirect anything with a query string back to the base URL.
There's a discussion specifically about that right here:
http://www.webmasterworld.com/apache/3203401.htm
You could start with this code and refine it from there:
RewriteCond %{QUERY_STRING} .
RewriteRule (.*) http://www.example.com/$1? [R=301,L]
Good luck.
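If you'd rather prototype the rule's behavior outside of .htaccess, the same idea - strip any query string and 301 to the bare URL - can be sketched in Python with the standard library. The function name here is hypothetical, just to illustrate the logic:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url):
    """Return the 301 target for a URL with a query string, or None if no redirect is needed."""
    parts = urlsplit(url)
    if not parts.query:
        return None  # already canonical, nothing to redirect
    # Rebuild the URL with the query string (and fragment) dropped
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_redirect("http://www.example.com/page.php?id=5&ref=foo"))
# -> http://www.example.com/page.php
```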
-
RE: Is Blog Commenting still an acceptable way of linking
You're welcome. Happy to help!
-
RE: Removing Dynamic "noindex" URL's from Index
You could try adding the pages you want to remove to your robots.txt file. Since you're not linking to them, and it's very unlikely that Googlebot will index those pages naturally now, this might be a better way of telling it which pages to explicitly not index.
I'm not really sure how quickly this will trigger Google to remove those pages from the index - but they do reference robots.txt on the actual "Remove URLs" page of WMT ---> "Use **robots.txt** to specify how search engines should crawl your site, or request **removal** of URLs from Google's search results ..."
For that technique, you'd want to add something like this for all of the pages you want to remove:
Disallow: /oldpage1toremove.php
That should work. If it doesn't, then I would probably just submit the requests through the "Remove URLs" tool.
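To sanity-check your Disallow rules before submitting anything, you can run them through Python's standard-library robots.txt parser. The rules below are illustrative stand-ins for your actual file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules like the one above
rules = """\
User-agent: *
Disallow: /oldpage1toremove.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches the wildcard group here, so the old page is blocked
print(parser.can_fetch("Googlebot", "/oldpage1toremove.php"))  # False
print(parser.can_fetch("Googlebot", "/keep-this-page.php"))    # True
```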
-
RE: Places listings vs other ORGANIC listings
Hi there,
I've heard some stories and seen case studies stating that organic listings seem to be disappearing and are being replaced by local listings. There are some articles about that online:
Is Google plus local replacing organic results?
Are your organic listings being replaced with local listings?
In your client's case, maybe the local listing has not been optimized yet.
You can link your Google+ Local page to the site as "publisher". This tells Google that the site is the publisher of the profile's content. Relevant content on the site is also very important, as it helps strengthen the site's brand. Gathering customer reviews about the business is also very helpful.
Also, according to SEOMoz's Eye-Tracking Google SERPs study, local listings are more likely to be clicked in the search results.
Hope this helps!
Cheers!
-
RE: How to identify orphan pages?
Well, because they are 'orphans', you probably can't find them using a spider tool! I'd recommend the following process to find your orphan pages:
1. get a list of all the pages created by your CMS
2. get the list of all the pages found by Screaming Frog
3. combine the two URL lists in Excel and find the URLs from your CMS that are not in the Screaming Frog list.
You could probably use an Excel trick like this one:
http://superuser.com/questions/289650/how-to-compare-two-columns-and-find-differences-in-excel
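If you'd rather skip the Excel trick, the same comparison is a simple set difference in Python. The two lists below are stand-ins for your CMS export and the Screaming Frog crawl:

```python
# Stand-in data: URLs exported from the CMS vs. URLs found by the crawler
cms_urls = {
    "/index.html",
    "/about.html",
    "/old-landing-page.html",   # never linked from anywhere
}
crawled_urls = {
    "/index.html",
    "/about.html",
}

# Orphans are pages the CMS knows about but the crawler never reached
orphans = cms_urls - crawled_urls
print(sorted(orphans))  # ['/old-landing-page.html']
```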
-
RE: Internal linking using exact keywords Bad – Post Panda
In terms of post-Panda/Penguin linking, the best thing is to stick with an honest strategy by observing the following rules:
-
use breadcrumb-style links on all pages. These are good for usability, as they let users know where they are on your site. They also provide a way of including rich anchor text links on your site
-
use internal link anchor text in a natural way. For most pages, there are several closely related variations of anchor text that fit a common theme. For instance, in one case, I link to a certain page on one of my websites with the following keywords --> free domain appraisal, free domain name appraisal, free domain name valuation, free domain valuation. The page in question is very relevant for all of those variations and ranks well for all of them. It just takes a little bit of extra thought to come up with these keyword variations, and doing so can ultimately generate much more traffic for the page in question too.
-
cross-link related pages with relevant anchor text. There may be related pages within different category pages. Content-rich text links between those pages provide value to your readers and should not hurt you in any way.
-
I think a basic site hierarchy is reasonable to follow. Pick major categories for the top level links, and related topics below each of those categories.
-
-
RE: Google Places and Google+ for business
Hi there,
As far as I know, Google Places (Google+) does not have a multiple-users function. Google says only one Gmail account can manage a claimed Google Places listing.
An alternative method is to transfer your listing between Google Places accounts:
http://support.google.com/places/bin/answer.py?hl=en&answer=17104
Hope that helps...
-
RE: Domain Registrar
Name.com, NameCheap.com, Dynadot.com are all good and have reasonable prices. I personally use fabulous.com for most of my domains.
As annoying as Godaddy can be, I was happy to see they now offer two-factor authentication as a way of protecting domains.
I think a registrar's reputation can be a factor - mainly if they mess up something in your account, or don't handle renewals properly, causing your site to go offline.
-
RE: Will thousands of redirected pages have a negative impact on the site?
I've never had a problem creating a large number of redirects on a site before. It's something that happens quite a bit, for instance when a site is moving to a new domain or a new CMS, where it can often be very difficult to exactly recreate the same URL structure.
There's no limit to the number of redirects, just the number of hops. If the site had existing redirects in place, you might want to update those existing redirects as well, to point to the new final destination.
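To illustrate the "hops, not count" point: each chained redirect adds a hop the crawler has to follow, so it's worth flattening chains so every old URL points straight at its final destination. A small sketch of that idea (the redirect map here is made up):

```python
# Hypothetical redirect map: old URL -> new URL (some entries form a chain)
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",
    "/legacy": "/new-page",
}

def final_destination(url, redirects):
    """Follow a redirect chain to its end, guarding against loops."""
    seen = set()
    while url in redirects:
        if url in seen:
            raise ValueError("redirect loop at " + url)
        seen.add(url)
        url = redirects[url]
    return url

# Flatten: point every entry directly at its final target (one hop each)
flattened = {src: final_destination(src, redirects) for src in redirects}
print(flattened["/old-page"])  # /new-page
```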
-
RE: Bing search results
Hi there,
Bing is all about good, original content, authoritative inbound links, and well-structured webpages. If you want to rank well in Bing, you may want to focus on those.
Here is some useful information on Bing optimization:
Search engine optimization on Bing
Hope that helps!
-
RE: What is the best way to handle special characters in URLs
Hi there,
The special characters in your website's URLs are changed because URLs are sent over the Internet using the ASCII character set. That's why the & is converted to %26.
If you're trying to make your site rank, it is better to just simplify or shorten the URL, since search engines can have problems indexing URLs that contain special characters.
The special characters below are known to be "search-engine-spider-stoppers":
- ampersand (&)
- dollar sign ($)
- equals sign (=)
- percent sign (%)
- question mark (?)
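The %26 behavior described above is standard percent-encoding, which you can see for yourself with Python's standard library:

```python
from urllib.parse import quote, unquote

# Reserved characters are percent-encoded when placed in a URL path segment
raw = "shoes & boots"
encoded = quote(raw)       # '&' becomes %26, each space becomes %20
print(encoded)             # shoes%20%26%20boots
print(unquote(encoded))    # shoes & boots
```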
Hope that helps!
-
RE: Worthwhile to have global footer links?
If it's organized and really simple, like the footer here on SEOMoz (check below), then keep it. If it's just 5-6 footer links, I don't think it would take up a lot of link juice.
Plus, footer links (not considering SEO and Google) are indeed useful for site navigation purposes.