Best posts made by THB
-
RE: What is the most optimal URL structure
There are quite a few factors at play here.
1. I've always preferred, as a developer, to have end pages split up into categories and sub-categories for ease of development. It also lets the user know where they are within the site simply by looking at the URL.
There really is no right or wrong. You just have to do what makes sense for the site. If we're talking about a micro-site here, with only a handful of pages, then you don't need to create categories and sub-categories. Just use a flat URL, i.e. /vacuum-services.html instead of /services/vacuums/.
Remember to try to keep your preferred keywords as far to the left of the URL as possible so that some significance is placed on them. Not imperative, but if you can, I'd suggest it.
2. Always use hyphens to break up words. Underscores are seen as a form of concatenation by search engines, whereas hyphens are seen as separators. Using neither is not recommended, as it's not legible to the end-user and ultimately just forms one large word made up of several keywords. No good.
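To make that concrete, here's a quick hypothetical illustration (the paths are made up for the example):

Good:  /services/vacuum-repair/   (hyphens read as separate keywords)
Avoid: /services/vacuum_repair/   (the underscore reads as one concatenated word)
Avoid: /services/vacuumrepair/    (no separator at all; not legible to the user)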
-
RE: Total Links vs Ext. Followed Links
You are correct in thinking that. Nofollow links are used to say "we do not endorse this external reference", which means no PR value is passed (although Google still reserves the right to pass some PR if it feels the external reference is worthy), but the external site will still be crawled. Nofollow does not stop the bots from accessing the referenced URL.
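For reference, this is roughly what the two look like in the markup (example.com is just a placeholder):

<a href="http://www.example.com/">Example reference</a>                 <!-- normal, "dofollow" link: passes PR -->
<a href="http://www.example.com/" rel="nofollow">Example reference</a>  <!-- nofollow: no PR passed, but still crawlable -->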
With that said, you need a proper ratio of dofollow to nofollow backlinks to keep your incoming links looking organic. For example, if 99% of your backlinks are dofollow, Google might see that as a little fishy. The biggest mistake people make is passing on a backlink just because it has a nofollow tag on it. With all the social media sites out there these days, and the large majority of them applying nofollow tags to all external URLs, it doesn't make sense not to get your links out there regardless. Google still sees these backlinks and recognizes them as a reference to your site/company, and there is a strong signal in their current algo for that.
So don't dwell on it, trust me. You will find you're just investing too much time in something you cannot control. Just focus on creating relationships with relevant sites and the rest will fall into place. Pay no attention to the nofollow tag. Take it from a guy who used to be OBSESSED haha. I read more source code than actual page content for years!
End note: to help put things in perspective, consider a nofollow backlink to your site coming from wikipedia.com or something like that, and then consider a dofollow backlink coming from some no-name, or less reputable, site. You're going to see a more significant increase in popularity from that one nofollow backlink than from any of those dofollow backlinks, know what I mean?
-
RE: Thoughts on Google+ influence on SERPs?
A step in the right direction for whom, Google? Of course. But not necessarily for the end-user by any stretch of the imagination.
To be honest, my care for Google, its products, its advice on SEO, and so on, has completely fizzled over the last year or so as they continue to practice the very black-hat techniques that us webmasters get in sh*t for. Sorry Goog's, but I won't use your second-tier G+ anytime soon, that's for sure.
Even Google's search has lost its relevance for me as they're opting to give more SERP real estate to big name brands (which is just a nice way of saying that they're giving more SERP real estate to companies that spend millions in AdWords, let's not kid ourselves here). Just because a company has a recognizable brand name, and spends millions on advertising, doesn't necessarily make their product any more relevant, or of better quality, than the little guys.
To the original post... of course G+ directly influences the SERP's. Do you think for a second that Google would have it any other way? Like I said, they are desperate to get people using their Social network, and this is one way to at least get webmasters involved.
Side boob: Google should re-focus Google+ into a business-oriented social network. Their reach does not extend to half of FB's user base, in that your typical non-web-savvy user (i.e. my mother) is never going to use Google Plus, so why market it to them? They're lucky if they have a FB account, and that's as far as they'll go, because their entire family is already set up on it. These are the people that actually click on the AdWords sponsored ads at the top of the SERP's, even though the majority of the sites in AdWords are irrelevant to the search term in question (at least their landing page is).
Watch for more Google (in)direct user-influence tactics coming soon... too bad for them it's a race they lost the day Mr. Zuckerberg bought the Facebook.com domain name.
-
RE: Www v.s non www
I worry about setting up a canonical tag that points to a URL Google can't access (since it's just being redirected via 302 back to the non-www version any time it tries to read the canonical URL). And since a canonical tag is kinda sorta like a 301, you'd ultimately be 301'ing (kinda sorta) back to the www version, only to have a 302 header sent, 302'ing Google back to the non-www. An endless loop, so to speak. I'm not sure how Google would handle this.
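Just to illustrate the setup I'm worried about (the domain is a placeholder): the non-www page would carry something like this, pointing Google at a www URL that currently 302s straight back to the non-www version:

<!-- on http://example.com/page.html -->
<link rel="canonical" href="http://www.example.com/page.html" />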
How about just working 24/7 to resolve the "technical problem" that is causing this? I know, easy for me to say.

-
RE: What do Bing and Yahoo look for in a site?
Well, to begin, Yahoo search is now run off the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's based on a different algo than once before. Bing now completely runs Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
And I can only assume Bing watches for black-hat SEO tactics; I don't have any hard data to back that up, but it's safe to say they do.
A huge mistake website owners make is to optimize their sites for Google only. Google only makes up roughly 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to see what makes a site rank well in Bing.
As long as you stick to the fundamentals, i.e. proper internal link structure; solid, safe, relevant backlinks to your site; using your webmaster tools (and SEOmoz ;)) to make sure site errors and such are taken care of; and error-free HTML with proper H1-H6 tags (where applicable), title tags, meta tags, etc., then, and only then, should you start tweaking your site for direct optimization for each engine.
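As a rough sketch of those on-page fundamentals in markup (all names and values below are placeholders, not a definitive template):

<head>
  <title>Vacuum Repair Services | Example Company</title>
  <meta name="description" content="Professional vacuum repair services in Example City." />
</head>
<body>
  <h1>Vacuum Repair Services</h1>
  <!-- one H1 per page; H2-H6 for sub-sections where applicable -->
  ...
</body>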
-
RE: Thoughts on Google+ influence on SERPs?
Hey, leave my mama outta this

What I'm saying in regards to that, and I thought I was being quite clear, is that Google would stand a much better chance of dominating the social networking niche if they re-adjusted their priorities, and lost the boner they have for conquering Facebook. Unless they can figure out a legitimate way of allowing people to copy their entire FB profile over in one click, they won't ever be able to grab the entire, existing, FB user-base. It just won't happen. People have invested waaaaay too much time uploading thousands of photos and videos, engaging in countless conversations/emails/messages, and creating their network of friends and family. I'm just saying that their initial thought process of trying to convert people was hopeless from the get-go.
I don't disagree that they might be on to something in terms of the future of social networking; however, for every new idea they add to G+, FB can easily integrate the same idea into their site and they're back to being even. The same way Google does to every little competitive company that is even but a speck of dust on Google's radar. Google leaves no room for competition, so why should Facebook?
For the record, I couldn't care less either way. My days of being over-actively involved in my own personal social media have come and gone. And I offer both solutions to any clients that inquire.
Oh, and, I do quite well in the SERP's, actually. Google, Bing, and so on. I've seen a ~500% increase in traffic over the last 2 months to several of my websites, so let's not go there.
Come on now... Google has been caught a handful of times doing the very things they penalize websites for. Case in point (and these blackhat tactics are as recent as this past week!):
http://www.seobook.com/post-sponsored-google
http://www.seroundtable.com/google-caught-for-paid-links-14539.html
I could post many more resources/articles on others they've done in the past, but there'd be no fun in that.

Their shady tactics don't stop there, however:
http://www.electronista.com/articles/11/07/25/google.street.view.now.known.to.have.seen.devices/
Just because I don't use Google+ personally doesn't mean it's not offered to my clients. But their reaction is overwhelmingly the same: "Ugh, another social network? When is it going to stop!?" in reference to FB, Twitter, G+, LinkedIn, and so on. 'Cause you can't just replicate your content across them all and expect to be successful, so that's where the "Ugh" comes into play.
-
RE: Www v.s non www
Hehe.
Generally speaking, and I've actually come across this quite a bit lately, it's better to just put your efforts towards fixing the technical issues than to try to work around them with redirects and canonical tags. Easy for me to say, though, when it's not my technical problem, nor my money/time on the line to fix it! But that is always the best approach in my opinion.
-
RE: Improve CTR with Special Characters in Meta-Description / Title Tags
Well, I don't think there's any denying that using special characters/symbols helps something stand out. And while I can't give you any definitive answers, as I haven't run any case studies myself, I can tell you that people are becoming more sensitive to spammy-looking sites within the SERP's.
With that said, if you choose to use any special characters within your title/meta tags, tread lightly, as preceding your actual site/page title with 5 moons or stars might look a little fishy to some.
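For example (purely illustrative titles, not tested data):

<title>★ Car Parts Sale | Example Store</title>            <!-- one symbol: stands out without screaming spam -->
<title>★★★★★ Car Parts Sale ★★★★★ | Example Store</title>  <!-- five stars on each side: the fishy-looking extreme -->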
But I agree, as an end-user, your eyes are definitely drawn to things that stand out first and foremost.
The last thing you'd want is to be ranked for a symbol by the major search engines.
Sorry, that's probably not much help... just my 2 cents on the matter for what it's worth.
-
RE: How to Create automate Content for Big Ecommerce Site
Agreed.
"I dont want to write this differently on each 1k [pages]"
But why not? It's very time consuming, yes, but it will pay off exponentially in the long run.
And then once you're caught up, writing those descriptions for any additional products that come in will be a breeze.
That's the best-case solution; but to answer your concern directly, yes, there is a good chance that search engines will just disregard each write-up as actual content if it's in a template fashion with only a word or two swapped out.
-
RE: Www.mydomain.tld same as mydomain.tld - unexpected results
To handle this you need to set up a permanent redirect from mydomain.tld to www.mydomain.tld (or vice versa).
If your site is running Apache, you can use a .htaccess file to handle this.
Simply create a file called .htaccess (make sure when you save the file you are not saving it as .txt, as that will append .txt to the file name like so: .htaccess.txt, which you do not want; save as 'All Files' or the equivalent), and save it to your public_html directory, or whatever your web root is.
Add the following:
# redirect any non-www host to the www version with a permanent (301) redirect
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.mydomain\.tld$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.tld/$1 [L,R=301]
That should now effectively keep your site from being split into two.
Google also recommends you create two profiles within Google Webmaster Tools, one for the www and one for the non-www, where you can then specify under "Site Configuration > Settings > Preferred Domain" which version you have chosen as per your .htaccess file.
Hope that answers your question.
-
RE: Have we been penalised?
Agreed. Search engines' geo-location tools are based off IP address, so if your work, for example, has a server located in another country, that could throw off your results. Take my server at work: it's based in the US while I'm in Canada.
I usually check things like that on my Blackberry to get a non-biased result (usually).
Another thing to keep in mind is that if you're logged into a Google account, your results could also differ, as Google will try to serve you biased results based on your browsing preferences. You can also append &pws=0 to any given Google search results URL to disable personalized results, i.e. http://www.google.ca/search?aq=f&gcx=w&sourceid=chrome&ie=UTF-8&q=property+in+spain&pws=0
So, log out and append the &pws=0 as shown above to ensure you are getting the most unbiased results possible (of course, you're still going to be geo-targeted by Google, and unless you have several proxies, there's nothing you can do about that).
-
RE: Pagination solution
Not all web crawlers honour the rel="prev" and rel="next" attributes, but I always use them because they cannot harm you and are especially helpful for crawlers that do take them into consideration.
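For reference, they go in the <head> of each pagination page, something along these lines (the URLs are placeholders):

<!-- on /car-parts?page=3 -->
<link rel="prev" href="/car-parts?page=2" />
<link rel="next" href="/car-parts?page=4" />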
I made the mistake, ages ago, of placing a canonical tag on my pagination pages that pointed to the first page. I didn't have a firm grasp of the canonical tag at that time, and I paid the price for it. Now I find that the canonical tag is grossly overused/misused, as you don't even need to place it on any of the pagination pages. Google knows what page it's on and will usually just disregard the canonical tag; it will only take it into consideration if the URL and the canonical tag don't match.
Make sure to change up your title/meta tags to accommodate the various pages, ie.
<title>Car Parts - Page 2/3/4/5/6/etc</title>
Adding a page reference to your heading (H1-H6) tags is not necessary, as the content of the page is still the same, just on another page.
Consider adding the title attribute to your paging links as well, as a notifier:
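Something like this (hypothetical markup; the URL and text are just placeholders):

<a href="/car-parts?page=3" title="Go to page 3 of Car Parts">3</a>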
There are additional rel attribute values that can be helpful, too: http://www.w3.org/TR/html4/types.html#type-links
-
RE: Rel="external" What affect if any does this have on SEO
"rel="external" does nothing except notify the browser to open a new window, like target="_blank". The only difference is that rel="external" is xhtml valid, and target="_blank" is not."
Here is a good discussion on the subject... a little dated, but still holds true: http://forums.digitalpoint.com/showthread.php?t=61308
-
RE: Www to non www
# 301 any www request over to the bare (non-www) domain
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
Put that in your .htaccess within your root web directory (typically public_html), and you should be good.
Make sure you save your .htaccess file properly (save as "All Files" and not as a text file, so that .txt doesn't get appended to it).
-
RE: Sugestion....
Well, what are your concerns?
I mean, you're using WordPress, so the majority of the SEO structure should be taken care of. What keywords/phrases are you looking to rank for?
And I believe you have misspelled 'habitat' in your logo. You have 2 b's when there should only be 1.
-
RE: How can it be possible to get 404 Errors on URLS here in SEOMOZ, but if i try to load them in my Browser they load...
When I load up "http://shop.samson.de/index.php/products/hardwarepaket.html" into my browser it returns a 404 page:
404 - Seite nicht gefunden (404 - Page not found)
Unfortunately, the requested page could not be found; apparently an incorrect URL was entered, which is how the 404 error came about. Please use our search or go to our home page to browse our extensive range.
The page might load up, technically, but it's returning a 404 Page Not Found header response, which is why crawlers, like SEOmoz, are displaying it as a page not found.
This is expected, and an acceptable practice, so I would not worry (unless that page is supposed to have content, then you had better check the code for any errors).
-
RE: Website stuck on the second page
You must look at ranking in the SERP's as a popularity contest, so-to-speak. Backlinks by way of websites and social mediums are votes and can/will determine your placement in the search results.
Your chosen keywords/phrases might be extremely competitive as well. When starting out with a site, it's best to select some niche, lower-volume keywords, which are often more long-tail than the aggressive, highly competitive 1-3 word keyphrases. Trust me, going for the gusto right off the bat will only leave you more and more frustrated as the years go on. You can always adjust your keywords down the road, once you've established a good roll of backlinks and visitors, to continue being more competitive.
And as I mentioned in an earlier post today, don't just focus on Google. There are other search engines out there as well, and you'd be making a large mistake in focusing solely on Google for SERP's, as Bing/Yahoo hold a decent market share, too.
Keep trucking. SEO is an art-form. It's not something just anybody can do on a whim and expect to get to #1 overnight. It might be wise to invest in an SEO team to help get you pointed in the right direction.
- Marc
-
RE: How to compete with spammers
I wouldn't worry (easy to say, hard to do), because if their content and such is as garbage as you say, it will catch up with them.
Google (and other search engines) track bounce rates on SERP click-throughs. Meaning, if a user clicks one of your shady competitors in the SERP's and quickly realizes that the site is no good (i.e. scraped, useless content; littered with ads, etc.), they will quickly come back to the SERP's and continue trying other sites. Google tracks this with cookies and uses it to determine whether a site is useful or not, and whether it deserves the spot in the SERP's that it currently has. I believe there's a 10 or 30 second 'bounce back' window: if the user clicks back to the SERP's within that timeframe (if someone can confirm whether it's 10 or 30 seconds), then it's promising for you (assuming your bounce rate is low compared to theirs). However, these things take time to iron out, especially if these sites have a ton of backlinks.
"on page optimization is superb"
Care to share your site and let others help be the judge of that? Second and third opinions can never hurt (usually).

Only if you want.
- Marc