subdomain = www.
So what you're probably seeing is that a lot of your domains are pointing to www.domain.com rather than domain.com.
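If you want the two hosts consolidated, the usual fix is a 301 from the bare domain to the www host. A minimal Apache .htaccess sketch, with domain.com as the placeholder from above:

RewriteEngine On
# Catch requests for the bare domain...
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
# ...and 301 them to the www host, preserving the path
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]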
There are no automated link-building solutions; however, SEOmoz does offer lots of education on the subject:
http://www.seomoz.org/beginners-guide-to-seo/growing-popularity-and-links
http://www.seomoz.org/article/the-professional-guide-to-link-building-2011
Many good blog posts in these categories:
http://www.seomoz.org/ugc/category/4
http://www.seomoz.org/blog/category/4
A good list here:
http://www.seomoz.org/directories
And an excellent tool for starting the process here:
http://www.opensiteexplorer.org
You may also find this Labs tool useful:
If I were you, I'd schedule a call and ask questions: what the benefits of having your content syndicated across the network are beyond name recognition, how they handle duplicate content problems, how they ensure your content is used properly, and so on. Put the onus on them to alleviate your fears. Chances are it won't be that beneficial for you.
With regard to your fear about the missing URL, this is probably a mass form email to pre-identified targets that was just badly executed. Call nSphere directly and ask to speak to him. See how that goes.
I'm looking for opinions on the following scenario:
SuperWidgets buys GreatWidgets. That business acquisition involves the purchase of GreatWidgets.com, a standard but well-established website with some nice backlinks. The business acquisition is communicated to previous customers of GreatWidgets through the normal channels.
What should be done with GreatWidgets.com? Redirect to a splash page informing visitors of the acquisition? 301 page-to-page for directly relevant content? If a splash page, how long would you keep it up?
Also, any opinions on how to handle any non-claimed local listings for the now-acquired business? Claimed ones will of course be handled, but what about the non-claimed and unclaimable?
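For reference, the page-to-page option is mechanically simple. A hypothetical Apache sketch on GreatWidgets.com (the paths and SuperWidgets.com destinations are illustrative, not from the question):

# 301 each well-linked GreatWidgets.com page to its closest
# SuperWidgets.com equivalent
Redirect 301 /widgets/blue-widget http://www.superwidgets.com/products/blue-widget
Redirect 301 /support http://www.superwidgets.com/support/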
For your primary target, I've got you lagging behind Yelp twice over, Yahoo Local, and a semi-heavyweight with a Places page. All in all, not bad for a month's work!
Checking on some of your secondaries, you look like you're again fighting directories and Deep-Steam. They appear to have the local advantage, but once you get your Places page and some more citations coming in, I think you'll start to bump up.
Since it appears you're targeting locally, I suggest taking a spin over to getlisted.org and making sure you're listed in the places it indicates. Give David Mihm's local ranking factors a good read, and put those to use as well.
All in all, you're making it happen, just keep trucking along!
Thinking it might be that I used a hyphen and you didn't, I just re-checked. Now, with hyphen I'm getting rank 19. Without, rank 9.
This is weird. :S
Quite possibly. I'm still getting a not-in-top-50, however. See attached image. :S
Yes-ish. You're going to want to read this:
What is telling you that you have these rankings? Rank Tracker has you not in the top 50 (NIT-50) for any page on your website.
If this is just you manually checking, there are many reasons you may be seeing your own pages. Search personalization is huge, from rank history to geography.
I'd say use it where it makes sense. Location finder? Use it. Contact page? Use it. Footer of every page? Nah.
Assuming this domain move is a recent redirect from your .co.uk to your .com, and a website previously existed separately on the .co.uk: did you blanket-redirect everything to one page, or does each page point to its equivalent? If it was just a blanket redirect, you may be leaking link juice. Also, to confirm: are you saying the .com's DA was previously 44 and dropped when you redirected the .co.uk to it, or that the .co.uk was 44 and the .com is only 33 now that you've moved?
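To illustrate the difference between the two (an Apache sketch with placeholder domains; your actual setup may differ):

RewriteEngine On

# Blanket redirect: every .co.uk URL dumps visitors on the .com
# homepage, discarding page-level link equity
# RewriteRule ^ http://www.example.com/ [R=301,L]

# Page-to-page: each .co.uk URL 301s to its .com equivalent,
# keeping the path and the links pointing at it
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]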
With regard to on-page reports, how are you attempting to pull this data? Just in the campaign tool, or are you using the on-page report card tool manually?
If your page does not rank in the top 50 for a keyword, it will not have an automatically produced on-page report for that keyword in the campaign tool. To receive reports for non-ranking pages/keywords, you will need to pull them manually using the report card grader under Research Tools.
I've never understood the point of having a meta title tag if you're going to use a regular <title>. Seems a little skeevy to me.
To your question, it would not have any positive effect on ranking, since that juice was already supplied via the traditional title.
While there is no authoritative answer from Google or Mr. Cutts, the general consensus among the SEO peers I just IM'd is that multiple titles on a page are bad. I agree with the consensus.
Years ago, multiple <title>s were a common blackhat technique. This was back when you could successfully stuff meta keywords and it would actually work. As for whether it would now be merely neutral or actively negative... I don't know, but my gut leans toward actively negative to some degree.
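Either way, the safe pattern is a single <title> per page (illustrative markup, not from the thread):

<head>
  <!-- Exactly one title element; duplicates risk looking manipulative -->
  <title>Blue Widgets | Example Widget Co.</title>
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
</head>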
Not logged in, no location set, I've got you coming in 6th naturally here, just below a block of videos and well below the fold. There may well be that many searches, but you're ranking below what I would consider just-as-relevant content (and then Wikipedia, sigh). A 1% CTR is not bad for slot #6.
I think you should consider query intent a bit here. What is a user searching for 'kitchens' looking for? It's a very general term, so you can't really tell.
If I were you, I'd focus on local terms, ones that are more closely matched to the content you have.
So, yeah, I'm agreeing with you.
I'd check with your delivery network. Might be something on their end. Or it might be that my user-agent switcher isn't working properly (it has happened before).
A further hmm:
Xenu 403's, but Screaming Frog doesn't.
Further to Alan's point: if I manually change my user agent to something unknown, the site returns a 403. Are you or your host using any sort of user-agent detection funkiness?
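A quick way to reproduce the test without a browser extension, assuming you have curl handy (example.com stands in for the actual site):

# Status code with curl's default user agent
curl -s -o /dev/null -w "%{http_code}\n" http://www.example.com/

# Same request with an unknown user agent; a 403 here points to
# user-agent filtering on the server or CDN
curl -s -o /dev/null -w "%{http_code}\n" -A "UnknownAgent/1.0" http://www.example.com/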
I'm in the process of reviewing on-site URL structure on a few sites, and I've run into something I can't decide between.
I am forced to choose between these two examples:
MediaRoom/CaseStudies.aspx (camel case)
mediaroom/casestudies (all lower case, mashed, no dashes)
I would personally rather see:
media-room/case-studies/
However, implementing the dashes would require manually rewriting roughly 10,000 URLs. Implementing 301s from the existing structure to whatever I choose would be trivial, so there is no concern there.
Given the choice between CamelCase and lower-mashed, which would you choose? Why?
You need to use a rel=canonical tag. You can read a really informative post by Rand on it here:
This thread has some good opencart-specific advice regarding implementation:
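For reference, the tag itself is a single link element in the <head> of each duplicate page, pointing at the version you want indexed (the URL here is a placeholder):

<link rel="canonical" href="http://www.example.com/product-name" />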
I would first determine your reporting threshold. If you're below, say, result 100, I would consider you not ranking at all. Perhaps refine your target keyword list to something more realistic.
Personally, I use NIT50 or NIT30 (not-in-top-*) as my threshold measurements.