Category: Search Engine Trends
Explore current search engine trends with fellow SEOs.
Duplicate content penalisation?
Hey, duplicate content is duplicate content, whether that is pages on the same site, different sites, different IPs, different C-blocks, etc. As long as these 'snippets' are not the main page content, having a duplicate bit of text between the blog and the product page should not be a major issue. The point to take away here is that lots of sites use duplicate product descriptions and the like; it is better if they are unique, but it will not generate a penalty if it is not the main page content. Go unique if you can, but ultimately do what is best for your users. Hope that helps. Marcus
| Marcus_Miller
Penalty or Algorithm hit?
Michael, if you got hit on the 24th of February, this was the Panda algorithm update. First, if you're sure your content is 100% unique and your site is high quality, I would go to http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en&start=800 This thread is dedicated to people with high-quality sites that have been negatively affected by this change, and a Google employee will take a closer look. On the other hand, the things you can do to help your site are (this is my opinion; webmasters and SEOs are still trying to figure out how to recover, or what criteria triggered the Panda update on their sites): a trustworthy UI (user interface); a redesign, since your website looks dated, so see if there is a possibility to rebuild it on a robust CMS; and site speed.
| wissamdandan
Major Slipping of highly important keywords in a few days
A great test on those paid links would be to change the directory listings to point to another website or domain that you own which has been ranking pretty steadily over the past few weeks. If you see a decrease in organic rankings on that site as well, then you'll know for sure that the paid directories were the problem. If the directory listing was cheap, then it might not be worth your time - you'll need to weigh the pros and cons.
| alhallinan
Will Google punish us for using formulaic keyword-rich content on different pages on our site?
Agreed. Domain authority will play a significant role, and a more authoritative site may get away with it more easily than a small one. I speak from what I have seen (not speculation).
| Dan-Petrovic
High bounce rates from content articles influencing our rankings for rest of site
Does bounce rate affect your rankings? Here's an oldie but goodie: http://me-in-seo.blogspot.com/2009/02/does-bounce-rate-affect-sites-google.html How can you lower your bounce rate: http://searchengineland.com/two-simple-rules-for-fixing-high-bounce-rate-pages-35125 What happened with Google's Panda update? http://www.seomoz.org/blog/googles-farmer-update-analysis-of-winners-vs-losers Note that Chrome collects direct user feedback on bounces. After you hit "back" from a SERP result it lets you block that site. No doubt doing so signals spam to the algo.
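Since the thread is about bounce rate, it may help to be clear on what is actually being measured. Bounce rate is simply the share of sessions that viewed only one page. A minimal sketch in Python (the sample session data is made up for illustration):

```python
def bounce_rate(pageviews_per_session):
    """Return bounce rate as a fraction: single-page sessions / total sessions."""
    sessions = len(pageviews_per_session)
    if sessions == 0:
        return 0.0
    # A "bounce" is a session where exactly one page was viewed.
    bounces = sum(1 for pages in pageviews_per_session if pages == 1)
    return bounces / sessions

# Hypothetical sample: 6 sessions, 3 of which viewed only one page.
sample = [1, 4, 1, 2, 1, 3]
print(f"{bounce_rate(sample):.0%}")  # prints "50%"
```

Note that a "high" bounce rate on content articles is not automatically bad: a visitor who reads one article, gets their answer, and leaves has still been served well.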
| TheEspresseo
Where to syndicate?
Have you tried PRLeap, PRWeb, and Vocus? I've seen tons of articles get picked up by Google news and displayed at the top of the SERPS without too much work using PRWeb, and Vocus can help you find journalists responsible for syndicating content in all kinds of big name channels. You can also try googling creatively with industry-relevant keywords and advanced search operators to find sites that Google itself currently sees as authoritative, which are also publishing guest articles, then try submitting your articles to them. I would also consider becoming a syndicator of content yourself. Google has guidelines to follow to get yourself recognized as a source for Google news (you have to apply), and it may behoove you to set yourself up to syndicate other people's content (and your own). It would give you tons of control, the opportunity to crowd-source content on your own site, etc.
| TheEspresseo
Bounce rate and rankings
So they clearly have ways of detecting bounces across a variety of methods (return to SERP, GA, toolbar). I have seen no evidence that they use this directly as a ranking factor. It seems pretty noisy / easily gamed and also not desperately well-correlated with quality (as sometimes if you are just looking for a phone number for example, a bounce is the desired behaviour). I think they are using usage data of a variety of kinds to measure and improve the algo, but I'm not convinced it factors in directly from bounce rate itself. Hope that helps.
| willcritchlow
How did NexTag.com Survive the Algorithm Update?
The top three gainers from the Farmer update were Amazon.com, eHow.com and NexTag.com. There is a pretty complete article on the visibility changes from SELand: http://searchengineland.com/who-lost-in-googles-farmer-algorithm-change-66173

NexTag specifically is a very old property, going back to 1999. The authority built up from being one of those older properties has to carry a lot of weight. The majority of the losers from the Farmer update are content farms, but where we see eHow survive, I'd wager it is because eHow is the origin of many of the pieces of content being farmed out. This, to me, relates to local search. Sites like Localeze are the source of much of the scraped data that other directories use to show addresses, business categories, etc. So Localeze, as the source of this data, has authority, while the individual pages its data is entered into do not have that same authority and protection. Duplicate content can be tracked to an origin: the time the post was made, the time Google first crawled it, etc. So in NexTag's case, being the origin for so many things, I'd say it was safe.

NexTag is also very trusted by Google; they have had a Google Shopping relationship for years because (and this is just my observation) Google feels it can trust NexTag's system to deliver a viable product when its links are clicked. I'd wager also that NexTag had a much better link portfolio than the losers, so those trying to copy the NexTag model were shot down in favour of the original provider. Again, these are my best guesses based on the data. Hope it helps. Cheers,
| MatthewEgan
Recent changes to suggested search algorithm?
I saw discussion about this on Twitter last night. Justin from Distilled wrote a post saying that Google is now blocking "scam" in suggested search and instant search. Look through the comments, as some people are still seeing "scam", and there are other terms to watch for, such as "complaints", "pyramid", etc. http://www.distilled.co.uk/blog/seo/google-blocking-scam-keyword-in-autocomplete/
| KeriMorgret
Google changing the casing in SERPs of our domain name in Title tag!
I believe I've seen at least one other report of the case issue, either in the public Q&A or on Search Engine Roundtable. In any case, it's not just you. I'll bookmark this thread, and add any case information to the thread as I find it. For the titles, Google has been doing that for a little while to some sites. Barry Schwartz has written about it at http://www.seroundtable.com/google-title-selection-12989.html.
| KeriMorgret
Selling the same products from more than one website
I agree with the other two responses - why have 3 sites selling the same product? This problem isn't new, though. Many big sites face the same issue, where they sell the same products as many of their competitors. You should be fine as long as you make each page different from the others. Amazon got around this by giving users the ability to write reviews of the products, which ensured that each page had unique user-generated content and was therefore useful to Google. If you can, stick to one site.
| A_Q
Does Google factor site traffic, time on site, bounce rate, etc. into its ranking algorithm?
If Google came out today and said "site traffic, time on site and bounce rate are all going to be factors in our algorithm", then ethically challenged individuals and companies might consider hiring banks of users in places with low labour costs to try to manipulate the algorithm. It is almost certain that Google has considered this in the past but is worried that people may try to game the system. As such, I would expect any ranking element related to the quality of a site to be more complex. There are suggestions that Google is trying to get algorithms to judge the aesthetic quality of a site, and that this could be a factor in the future.
| CPU
301 redirect question
Thanks, Elias, for your fast response. You are correct in assuming that the URLs are not optimised, and I appreciate your feedback. I totally agree with you about creating websites for both users and search engines; from my experience this is often overlooked.
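For anyone else reading this thread: a 301 is a permanent server-side redirect, which passes most link equity from the old URL to the new one. A minimal Apache `.htaccess` sketch (the paths here are hypothetical, not from the original question):

```apache
# Hypothetical example: permanently redirect an unoptimised URL
# to a keyword-friendly one. Requires mod_rewrite.
RewriteEngine On
RewriteRule ^product\.php$ /blue-widgets/ [R=301,L]

# Or a simple one-to-one redirect via mod_alias:
Redirect 301 /old-page.html /new-page.html
```

The equivalent on nginx or IIS uses different syntax, but the key point is the same: return status 301, not 302, so search engines treat the move as permanent.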
| Cyle
Which is better for SEO: 1 big site or a number of smaller sites?
It may be that when someone searches for product one, the site product-one-review.com ranks better than the product one page on the .com site, all other things (links etc.) being equal. However, what happens if the visitor wants to buy product one and product two in the same visit and the same transaction? Would they have to jump between two different sites? Would there be duplicate content on the product one site about product two that was also on the product two site? There are non-SEO considerations to take into account. I would go for the .com site, with pages devoted to and fully optimised for each product, and then try to build links to those product pages. Hope this helps!
| CPU