Out of interest, I tried Bing and the searches failed.
One more reason to add a plain text version in the noscript tag.
Could be. They could also be linked to on those phrases from other sites.
So I tested a different string from both menus:
"goat bucheret, carmody, dry aged jack, pt." -> success
and
"Satur Farms Green Salad" -> success
Perhaps you can confirm with your own test but it appears the claim is true.
However, as a backup, it couldn't hurt to include noscript content since that's literally the purpose of the tag. Just remember to maintain that content.
If they claim it, ask them to back it up with a real example or two. Then copy what they did (i.e., a noscript link, perhaps?).
You would have to employ GWMT's Change of Address feature to keep link equity from dissipating through the 301s, no?
Another tactic one client used was a website credits page. It had full-priority content dedicated to the authors, designers, and developers of the site. This link could be treated as higher priority than small footer text. As a bonus, the page would likely be more topical to your business. Not for everyone, however.
Given the spec, a 410 for truly gone pages will be better regardless of reporting or Google.
http://googlewebmastercentral.blogspot.ca/2011/05/do-404s-hurt-my-site.html
Sadly, from that page: "Currently Google treats 410s (Gone) the same as 404s (Not found), so it’s immaterial to us whether you return one or the other."
This is direct non-compliance, which I can only guess is due to people unwittingly using the code incorrectly. Then again, they have allowed little errors to completely wipe out a site before.
There has been a suggestion from other engineers that it will reduce the number of Googlebot retries and that 410'd pages will take longer to re-index if they do reappear. http://productforums.google.com/d/msg/webmasters/i70G2ZAhLmQ/neKEH4spacUJ
EDIT Oops, your question. It seems Google reports all the errors it finds, but I personally cannot attest to seeing a 410. Here is the list of errors in the GWMT help section: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=40132
410 is on the list.
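If you control the app layer, serving the 410 yourself is simple. Here's a minimal WSGI sketch; the paths in `GONE_PATHS` are made-up examples, so substitute your own removed URLs:

```python
# Minimal WSGI app that returns 410 Gone for permanently removed pages
# and 404 Not Found for everything else. GONE_PATHS is a made-up example
# list - substitute your own removed URLs.
GONE_PATHS = {"/old-product", "/discontinued-line"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in GONE_PATHS:
        # 410 signals the page was removed on purpose and is not coming back
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

You can serve it locally with `wsgiref.simple_server.make_server('', 8000, app)` from the standard library and check the status codes with curl.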
If the descriptions are very technical, then there is likely a fair amount of repetition in the sentence patterns, diction, etc. I'd recommend playing with regex to help transform the content into something original.
For instance, you could search for industry abbreviations like CW and replace them with their long forms: **Clockwise (CW)**. Maybe they overuse an adjective that you could change to your own voice.
Also, perhaps the stock descriptions have blocks of useless content you could strip out in the meantime?
The DB probably has a few other fields (name, product attributes etc) so be sure to find a unique way of assembling the meta description, title and details.
If you find enough to change, I'd think having the description would be better than having a page that is too light on words.
Be sure to mark up with http://schema.org/Product so SE's understand the nature of the content.
EDIT: I have used the regex technique to enhance the content of a database by adding inline tooltips, diagrams or figures, and glossary links. However, with Penguin, I would be careful with automated links. You would only want to create a handful using the same anchor text.
EDIT2: I forgot - MAKE FREQUENT BACKUPS. Regex is super powerful and can tank a database really fast. Make a backup of the original and of every successful iteration - it will take a little longer but it will save your butt when things go bad.
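As a concrete sketch of the abbreviation idea: the `EXPANSIONS` map and sample text below are invented for illustration, and you'd want to run anything like this against a backup copy of the descriptions, never the live table:

```python
import re

# Hypothetical map of industry abbreviations to their long forms.
EXPANSIONS = {
    "CW": "Clockwise (CW)",
    "CCW": "Counter-Clockwise (CCW)",
}

# Longest alternatives first - a common precaution so "CCW" is tried
# before the shorter "CW" in the alternation.
_PATTERN = re.compile(
    r"\b(" + "|".join(sorted(EXPANSIONS, key=len, reverse=True)) + r")\b"
)

def expand_abbreviations(text):
    # \b word boundaries stop partial matches inside longer words.
    return _PATTERN.sub(lambda m: EXPANSIONS[m.group(1)], text)

print(expand_abbreviations("Rotate the dial CW, then CCW to reset."))
# Rotate the dial Clockwise (CW), then Counter-Clockwise (CCW) to reset.
```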
A nofollow, in terms of juice, would actually hurt your goals: the link still gets allocated its portion of the juice, but that juice doesn't flow through. **Each nofollow link will siphon off a little juice.**
See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=96569
The effects of navigational links are diminished somewhat as Google treats them differently compared to content links. To help solidify this, surround the footer with
<nav></nav>
tags.
Review: http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links #5
Generally, remove any site-wide links that aren't always needed and place them on the pages where users would want the details. For instance, use a search form instead of a link to the search page.
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make part of my case, however.
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product - each being a full product page. On some pages, there are only 40 links; others can have upwards of 1,000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds.
But if Google followed them all, then conceivably it would build a stronger "organic" structure to the catalogue, as important products would get thousands of links and others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above, "Google might choose not to follow or to index all those links." He also mentions them being a spam signal so I think it still wise to keep them low even if the 100kb limit has been lifted. Clearly there are still ramifications - a concept reinforced by this site's reports and comments.
To my question... from what both of you have said, it doesn't appear there is strong evidence that a very high number of links directly causes an additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts - my focus always starts with the end user. But, I'd hate to have a structural item that causes undue damage.
The context is a parts page where potentially hundreds of links could be associated with the other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it is accelerating the decay of link juice, all the more reason. If not, the links may actually help certain products appear organically stronger (i.e., a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100 links.
Does the remaining juice get reallocated or does the page leak at a higher rate?
As to rich snippet mark up, I recommend starting with:
http://schema.org/BedAndBreakfast or
http://schema.org/LodgingBusiness
The rest of the question seems to be more about Google Sitelinks and Authorship.
Sitelinks are up to Google, and you need a certain amount of traffic before they will appear. When they do, they're based on linking structure and page popularity. The only thing you can do is tell Google which pages you don't want in the list.
Authorship may not apply here but there are many good articles about it on SEOmoz.
http://www.seomoz.org/ugc/google-authorship-and-the-fast-track-to-better-rankings-a-case-study
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
Correct?
If so, and knowing Google stops counting links somewhere around 100, how would having over 100 links impact the flow?
I.e., after that, is the juice just lost?
Also, I assume Google, to the best of its ability, organizes the links in order of importance such that content links are counted before footer links etc.
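Roughly, yes - that's the classic simplified model. As a toy illustration (the 0.85 damping factor is the figure from the original PageRank paper; the exact behavior Google uses today is unknown):

```python
# Toy model of link juice: each outbound link gets an equal share of
# the page's PR after damping. Real-world weighting (content vs. footer
# links, nofollow, the ~100-link cutoff) is Google's secret sauce.
DAMPING = 0.85  # damping factor from the original PageRank paper

def juice_per_link(page_pr, num_links):
    return page_pr * DAMPING / num_links

# A PR 5 page with 50 links passes 0.085 per link;
# the same page with 500 links passes only 0.0085 per link.
print(juice_per_link(5, 50))
print(juice_per_link(5, 500))
```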
Whoa, I should clarify.
I was not talking about news but the tons of other "entertainment" shows that have all sorts of intrusive sponsors. Even if money isn't exchanged, there are tons of ways to "pay" people in media.
I too wouldn't suggest taking money for back links directly but that doesn't shut the door on paid content. Again, back to the sponsorship concept where the high value content is made possible by taking money from a company that then gets credit for it.
I respect your attempt, Diane, to compensate authors more directly, albeit still through a piece-rate method. Cracked uses the concept quite successfully: http://www.cracked.com/article_19955_we-want-to-pay-you-to-write-us.html
I don't think swapping GA accounts would be the vehicle I would use however.
Sorry, I didn't see where you mentioned privacy but that's fair.
That is far from a universal rule. If there is a mutual benefit, then the barter system works just fine.
The same thing happens on TV shows: sometimes the story is sought after and the guest is paid, other times the publicist pays the show for the exposure.
Back links are a currency of the Internet (even if it annoys Google) and I would be happy to provide high value content to the right site for the links/exposure.
What if they just want to use a pen name to protect their privacy? It's hard to remember, with modern life being so exposed and annotated, but there are lots of people who prefer to remain anonymous.
So long as the persona is well developed, I wouldn't have a problem with it, provided the content was high quality.
I would prefer the link only because it's simple and fundamental to the Web.
What happens if you decide to drop GA or their account gets banned? Too many things taken out of your control there. The poster too, has to trust you will always use their account which isn't going to be an easy sell.
As to quality content, what topics?
Two weeks is a pretty short time for a new site to get accurate reports from GWT. The back links I found weren't valuable - none with a page authority over 1.
I would secure at least one high quality link and wait a few more weeks.