Seems that way, but therein lies the value and appeal.
Harder to get, but more valuable when you do. It'll be just as hard for your competitors and others, so if you can crack it, it's a great string to your bow.
Going by the 2012 Searchmetrics ranking factors report, the correlation in the data suggests that 'shares' carry more 'value' than 'likes':
http://www.searchmetrics.com/en/white-paper/google-ranking-factors-us-2012/
The data is coming up on a year old now, but I can't see why it would have changed this year. I would point out, however, that this is an aggregated score and a correlation, rather than quantitative evidence. Results in your niche may vary, but it's a pretty decent indication.
You'll also need to be aware that you can't really force a user into one sharing/liking output - let them do as they please. An A/B test of share banners containing Facebook likes versus Facebook shares might be interesting to run.
It's my understanding that Googlebot can read this text, regardless of CSS styles. You can check this yourself by putting the page URL into this website (do a simple search for a free report). The tool fetches your page as Googlebot would see it, so you can tell whether or not the content is read by Google.
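If you prefer, you can also grab the raw HTML from the command line by spoofing Googlebot's user agent with curl - a quick sketch, with a placeholder URL to swap for your own:

curl -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.example.com/your-page/

If the text appears in that output, it's in the HTML Googlebot fetches, whatever the CSS does to it visually.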
Now, as for whether or not this might be deemed duplicate content, I don't think you have much to worry about as you have already taken necessary steps to prevent any penalty. Implementing the canonical tags that you have will tell Google that any duplicate content there is for a user reason and is not trying to game the system.
Provided those tags remain in place, I think you'll be fine. A problem may occur if you're building outbound links and hiding them using display:none or other CSS styles. This is a big no-no and can get your site deindexed if Google finds it. Always worth bearing in mind for people on your team, but it looks like you've got everything under control!
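For reference, the sort of pattern to avoid looks something like this (a purely illustrative snippet - the wrapper and link are made up):

<div style="display:none">
  <a href="http://example.com/money-page">keyword-rich anchor text</a>
</div>

Links served to crawlers but hidden from users are exactly what Google's quality guidelines target, so that's the thing to watch for.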
It's a bit of a grey area, that's for sure. There's evidence to suggest that it works, but you'd often be taking a shot in the dark.
There are a number of things to consider: has the domain been dropped? Has it picked up a penalty? Does it carry a load of low-quality, spammy links? Some of these you can get an idea of before any purchase, but with other factors you're flying blind. That's not something I want to do with my SEO - I want control and accountability.
You also need to ask yourself how exactly this is building your brand. Buying domains and redirecting them is a classic case of chasing the algorithm: optimising for a search engine, not a user. Again, it's not the sort of practice I'd want to engage in, regardless of whatever benefit it might provide. Besides, if your website ever falls under manual review and a Google employee sees hundreds of domains redirected to your site from a whole host of unrelated industries, I can tell you that it's not going to look good for your website.
Links/resources pages can, under the right circumstances, be a great place to be linked from.
Now this is dependent on a number of factors. If a resource page contains links only from within its niche/area of expertise, and sits on a website that you can be sure hasn't built dodgy links itself, then there should be no harm in having a link there. Even if the page has a lot of outbound links, a link from a reputable website will still be worth something, although its value might be diluted by the other links.
Of course, a page with one link pointing to pet supplies and another to Viagra is going to raise suspicion. Similarly, if the resource page sits on an authoritative domain but its PA score is nil or very low, it may not be worth pursuing. That's why I'd also approach the issue from a branding angle and ask myself, "Would a link to my website from this website come across as authoritative to the user?" That is to say, if someone on a user journey to buy car parts in Michigan finds a link to your car parts website on a government resource page, that's going to be a ringing endorsement in the eyes of both the user and Googlebot.
In many ways, you could approach links/resource pages like web directories. If you have the opportunity to get a link from an authoritative one on a solid website, I can't see it doing much harm.
Dan Shure wrote an excellent SEOMoz blog post about optimising WordPress earlier in the year. It goes into all the detail you could wish for, so rather than repeat it, I'll just post it!
I've just quickly skimmed the post again, and it provides the answer I think you're looking for.
I think you're certainly looking at a duplication issue here.
Bearing in mind I don't know how your site generates pages, there are a couple of things you could do. First of all, I can see that you have a rel=canonical system in place, which is a good start. One option would be to choose the one version of the URL you'd wish to prioritise and then change the canonical on the other version to point to that page. For example, you might change the canonical link on the duplicate version so it points at your preferred URL:

<link rel="canonical" href="http://www.pakwheels.com/used-cars/search/-/mk_Toyota/md_Corolla/">
Alternatively, and what I'd recommend, would be to 301 redirect the duplicate URLs to your preferred address, as those pages serve no extra purpose for the user. With a 301 in place, the duplicate URLs no longer resolve as separate pages, so Google won't index them or flag up a duplicate content issue.
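If the site runs on Apache, this can be a one-liner in your .htaccess file. A minimal sketch, assuming the duplicate URL only differs in the order of the mk_/md_ segments (the source path here is hypothetical - swap in however your duplicate URLs actually look):

# Send the hypothetical duplicate URL to the preferred version
Redirect 301 /used-cars/search/-/md_Corolla/mk_Toyota/ http://www.pakwheels.com/used-cars/search/-/mk_Toyota/md_Corolla/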
There are a few solutions here. If we rule out any chance of deleting these pages (would deleting them affect your ability to create products from those manufacturers in the future? I'm not sure what system you're working with here), you can take steps to limit any adverse effect this could have.
First, you could add the following code to the head tag of your HTML:
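<meta name="robots" content="noindex">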
This is easy to implement, but some robots and crawlers have been known to ignore the directive.
Alternatively, you can add rules to your robots.txt file, telling crawlers to ignore certain pages. In this case, you might want to add:
User-Agent: *
Allow: /
Disallow: */littermaid-manufacturer
You'd want to do this for all of those empty pages, but that could take time and can get quite messy if there are dozens of pages you want to block.
I'd highly recommend implementing a rel=canonical system on your site (I would even if you didn't have this problem). The tag can help to reduce any negative duplicate content effect. SEOMoz provides a detailed canonical guide here, which I recommend reading through.
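For reference, the tag sits in the head of each duplicate page and points at the version you want indexed - a minimal sketch with a placeholder URL:

<link rel="canonical" href="http://www.example.com/preferred-version/">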
Finally, as you've mentioned, adding content to these pages would also be a good idea, regardless of whether they're in use. A detailed description of roughly 300+ words (a very broad figure - go as detailed as you can) would dilute any duplicate content elements on the pages. In addition, it will be useful to have on the page if and when products are eventually added, which helps if you wish to optimise the page itself. Put it in the first paragraph under your top nav for now; you can always move it later.
Hope this helps - to summarise, I'd recommend implementing rel=canonical on your site, adding content to those pages and, if you want them blocked from crawlers completely, add the pages to your robots.txt.
Hi Carin
Most likely you'll be required to pay a fee to acquire the use of such images. Getty Images provides royalty-free images online, if you're interested.
A more affordable option, especially for startup blogs and authors, is to download royalty-free images at no cost. There are a few sites that offer this, such as http://www.sxc.hu/ and http://www.freedigitalphotos.net/
Google also allows you to search for images that are free to share or use. If you click on this advanced search query for 'apple', you can see at the bottom of the page a setting for usage rights. Set that to 'free to use or share' and you'll get a bunch of images you can use.
Hope this helps!
No problem, glad to have helped and hope it's not too much of a frustration for you!
This is something that was first talked about around February 2011. SearchEngineWatch provided a great insight into how and why it was happening, which I'd highly recommend reading through.
Great answer above me, so I'll only add another resource for finding guest post opportunities.
Stoked SEO's link building query generator utilises all the different Google search queries you can use to find guest posting opportunities.
Don't forget to look at related blogs on sites like Technorati as well, and approach them through their contact forms.
It's certainly a safe site.
As with all article sites, its value has been diminished significantly; as far as article sites go, though, SelfGrowth is probably one of the better ones out there. When we had a surplus of content on a previous site a few months back, we used it. Rankings did increase, but it's not like we used SelfGrowth exclusively, so it's impossible to claim causation.
A link from there is unlikely to hurt and might pass on a bit of 'link juice', for lack of a better term. However, I'd only use it if you're overflowing with time-sensitive content that you have nowhere else to put. Otherwise, I'd nearly always recommend doing some targeted outreach and getting your article onto a related blog in your niche.
It's pretty hard to give a 'right' amount here.
Of course, it's well documented that more content on a page correlates strongly with improved rankings (and conversions). But it's impossible to point to a golden threshold of characters.
I'd rather bring up the point you make about stuffing. That's probably the main thing to keep in mind when writing descriptions or content: don't make it look like you're gaming a search engine; keep it great for the user. If you can use your keyword multiple times, that's great. But, as you allude to, writing it for the sake of getting it on the page more often is a bad move.
If 160 words for a description is the absolute most you can say on a topic without repeating yourself, then 160 is the right amount **in this case**. Other times it might be more, and sometimes even less; it really depends on the context.
You might be able to squeeze more content into a description by including, say, an example of how a system/process works. But I'd always stay focused on writing for a user, not a search engine, and avoid stuffing where possible, as you rightly pointed out.
Howdy
You'll need to block the SEOMoz crawler in your robots.txt file. Edit the file and insert this bit of code:
User-Agent: rogerbot
Allow: /
Disallow: /forum/
If that doesn't work, you might want to add a Wildcard too, making it:
Disallow: */forum/
If you're not too sure about editing the robots.txt file, it might be an idea to check the SEOMoz guide: http://www.seomoz.org/learn-seo/robotstxt - and if you're really unsure, it's best to get someone else to do it for you, as you don't want to block your whole site from every crawler.
Hope this helps