Questions
Does 301 vs 302 matter when dealing with "social signal"?
Hi Robert, Jonathan has said that nothing will be passed as parameters, but even if they were, I don't see that there would ever be a _penalty_ for this (in the true sense of a penalty: algorithmic or manual penalisation for spam). The only danger I can think of is that you could flood Google with a million query strings and no canonicalisation if you did it badly, and even then, this would be easy to fix with canonicalisation on your own sites.
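The canonicalisation fix mentioned above can be sketched in a few lines. This is a minimal illustration, not anyone's production code; the example.com URL is hypothetical, and the idea is simply that every query-string variant declares the clean path as canonical:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    # Strip the query string (and fragment) so every tracked or
    # parameterised variant declares the same clean URL as canonical.
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return '<link rel="canonical" href="%s">' % clean

canonical_tag("https://example.com/page?utm_source=x&ref=123")
# -> '<link rel="canonical" href="https://example.com/page">'
```

With this in each page's head, a million query-string variants all consolidate on one URL.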
Intermediate & Advanced SEO | JaneCopland
Why are bit.ly links being indexed and ranked by Google?
Given that Chrome and most header checkers (even older ones) are processing the 301s, I don't think a minor header difference would throw off Google's crawlers. They have to handle a lot. I suspect it's more likely that either: (a) There was a technical problem the last time they crawled (which would be impossible to see now, if it had been fixed). (b) Some other signal is overwhelming or negating the 301 - such as massive direct links, canonicals, social, etc. That can be hard to measure. I don't think it's worth getting hung up on the particulars of Bit.ly's index. I suspect many of these issues are unique to them. I also expect problems will expand with scale. What works for hundreds of pages may not work for millions, and Google isn't always great at massive-scale redirects.
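The redirect-handling Dr. Pete describes can be pictured as a crawler walking a chain of hops. A toy sketch (the bit.ly and example.com URLs are made up, and the hop cap is a defensive guess at typical crawler behaviour, not a documented Google limit):

```python
def follow_redirects(start, redirect_map, max_hops=5):
    # Walk a redirect map the way a crawler would, bailing out on
    # overly long chains (crawlers give up after a handful of hops).
    chain = [start]
    while chain[-1] in redirect_map:
        if len(chain) > max_hops:
            raise RuntimeError("redirect chain too long: %r" % chain)
        chain.append(redirect_map[chain[-1]])
    return chain

# Hypothetical shortener mapping: one clean 301 hop.
follow_redirects("https://bit.ly/abc",
                 {"https://bit.ly/abc": "https://example.com/article"})
# -> ["https://bit.ly/abc", "https://example.com/article"]
```

At shortener scale, the interesting failures are the chains that loop or exceed the cap, which is exactly the "works for hundreds, breaks for millions" problem.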
Intermediate & Advanced SEO | Dr-Pete
How to best handle expired content?
As Jeff pointed out, with something like a Christmas category page, you'll want to leave it up so that you can use it easily next year (and maintain any links associated with the page). You can remove navigation links to the page during the off season and add them back in during the months leading up to the season, so that you're again passing internal link equity to the pages. Regarding the RSS feed content, I'd probably 301 it to a related video or to a relevant category page, though you could make a case for leaving it up, stating that the content is no longer available and providing related videos for the user to navigate to.
Intermediate & Advanced SEO | GeoffKenyon
Does using data-href="" work more effectively than href="" rel="nofollow"?
I think this is actually a really good question. The main reason most SEOs these days don't "sculpt" or "shape" with nofollow links anymore has to do with the fact that those links still take away from the total amount of PageRank available to be passed on to other links on the page. So the question I'm reading above seems to be: do data-href links still take a portion of PageRank away from the total PR available to be passed on to other links on the same page? My answer is "I don't know", but I'd like to see a test if anyone can think of a way to try it out. However, even if the test came back saying "No, these are treated differently and do not currently affect the total amount of PR available to other links on the page", I still would not use it for the purpose of PageRank sculpting. The reason is that how Google treats these links today can change tomorrow, making "tactics" like this a bad idea IMHO. It just leaves a mess for either you or some other poor SEO to clean up later. If I don't want PageRank to pass through a link on a page, I simply don't put the link on the page. In extreme circumstances where there is no other way around it, I might consider obfuscating the link with some JavaScript, for instance. However, even if you block the .js file that handles this "link" in robots.txt, Google still executes it (as you can see when viewing the cached version on Google for pages that do this).
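The PR-dilution point above can be sketched numerically. This is a toy model of the post-2009 "PageRank division" behaviour the answer alludes to, not Google's actual formula; the assumption that a data-href link would behave like a nofollowed one is exactly the open question being discussed:

```python
def pr_per_followed_link(page_pr, followed, nonpassing):
    # Toy model of post-2009 PageRank division: every link counts in
    # the denominator, but nofollowed/non-passing links waste their
    # share rather than returning it to the followed links.
    total = followed + nonpassing
    return page_pr / total if total else 0.0

# 10 links, 2 of them non-passing: each followed link gets 1/10...
pr_per_followed_link(1.0, 8, 2)   # -> 0.1
# ...but delete those 2 links outright and each gets 1/8.
pr_per_followed_link(1.0, 8, 0)   # -> 0.125
```

Which is why "don't put the link on the page" concentrates PR in a way that nofollow (and presumably data-href) does not.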
Technical SEO Issues | Everett
How to know when to use singular vs plural in anchor text and on-page copy?
A lot of results for singular/plural and synonyms are so similar as to be nearly identical for the first page or two, which is what really matters, and which is what Gregory Baka is referring to. You will notice a lot of the time, if you search for something, you'll see synonyms and variants bolded in the description and title in the SERPs. That is your signal that one is being treated as synonymous with (though not "identical to") the other.

In terms of singular vs plural, I tend to include both variations naturally within descriptions and on-page copy. External links tend to contain both versions too, unless you're buying the anchor text. I would think, based only on common sense and experience and not any quantifiable study, that Google looks for natural variation. If you have two different landing pages, one targeting the singular and the other targeting the plural, that would not only waste effort, money, link equity, etc., but it would also seem very unnatural. If I were writing an algorithm, I'd probably find a way to push such pages lower in the results unless other signals pointed to really high quality at the page and/or domain level.

ALL of this "common sense" stuff flies out the window, though, when any ambiguity of intent or results is involved. For example, with "cars" you could be talking about the animated movie, which is why you see IMDB, Disney and Wikipedia in the results. This disambiguation factor is why Google is pushing for semantic markup of the web, and is probably why topic modeling has become increasingly important (e.g. to rank better for "cars" when the user intent is to find the animated film, use words like "Pixar" and "Lightning McQueen" in the copy).

As a rule of thumb, I go with whatever sounds better and makes more sense to the user. For example, on a category page I might write "blue widgets" in the title, but I'd use "blue widget" on a single product page. From there I go with what the data says. Looking at Analytics a few months later, I pay attention to traffic and keywords as a follow-up. If the "blue widgets" category page gets 80% of its traffic from a #3 ranking for "blue widget" when it ranks #1 for "blue widgets", that tells me I should probably change the title to the singular version. In the end I usually find I get the best results when I don't think too hard about it and just go with my gut when writing. I know that's not scientific, but if it works, it works.
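The "blue widgets" follow-up check described above could be run against an analytics export. A rough sketch, where the keyword data and the 80% threshold are illustrative assumptions, not a recommended rule:

```python
def should_switch_to_singular(traffic_by_keyword, singular, threshold=0.8):
    # If the bulk of the page's organic keyword traffic already comes
    # from the singular variant, the title is probably targeting the
    # wrong form of the word.
    total = sum(traffic_by_keyword.values())
    if not total:
        return False
    return traffic_by_keyword.get(singular, 0) / total >= threshold

# Hypothetical analytics export for a "blue widgets" category page.
data = {"blue widget": 800, "blue widgets": 150, "widgets blue": 50}
should_switch_to_singular(data, "blue widget")   # -> True (80% share)
```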
Intermediate & Advanced SEO | Everett
Facebook sharing and referral links. Is there a happy medium?
That might work. The way our tracking links are set up, it's domain.com?ref=1111&goto=/this/path, and then it 302-redirects the user to domain.com/this/path. Would changing it to a 301 be damaging, given that it includes the referral link and isn't a natural backlink?
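For clarity, the tracking-link handling described here can be sketched like this. The parameter names ref and goto come from the question itself; everything else is an assumption about how such a redirect handler might be written:

```python
from urllib.parse import urlsplit, parse_qs

def resolve_tracking_link(url):
    # Pull the referral id and destination out of a tracking URL of
    # the form domain.com?ref=1111&goto=/this/path, returning the
    # (ref, target) pair the redirect handler would use.
    parts = urlsplit(url)
    qs = parse_qs(parts.query)
    ref = qs.get("ref", [None])[0]
    goto = qs.get("goto", ["/"])[0]
    target = "%s://%s%s" % (parts.scheme, parts.netloc, goto)
    return ref, target

resolve_tracking_link("https://domain.com?ref=1111&goto=/this/path")
# -> ("1111", "https://domain.com/this/path")
```

The 301-vs-302 question is then just which status code the handler sends after logging the ref value.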
Social Media | JDatSB
What is the best approach for getting comments indexed, but also providing a great UX?
You are absolutely right. That's why I avoid systems like Disqus and IntenseDebate. Comments are user-generated content; I want them to stay on the page as plain HTML and get indexed.
Intermediate & Advanced SEO | egorpe
In-house search engine
I've looked at Google CSE, yes. It doesn't provide enough control over result styling and formatting, and isn't as up to date with URLs as I would like.
Vertical SEO: Video, Image, Local | JDatSB
What is "evttag=" used for?
Well, it's certainly not an SEO thing. In fact, it's not even a valid HTML thing; here are all of the acceptable attributes for a DIV in HTML5: http://dev.w3.org/html5/spec/single-page.html#the-div-element Among many, many other things, it is breaking realtor.com's code in the validator: http://validator.w3.org/check?uri=http%3A%2F%2Fwww.realtor.com&charset=(detect+automatically)&doctype=Inline&group=0 In my experience, this is one of the easiest ways to ensure that Google can't crawl your site well or put confidence in your user experience. Other new standards, like schema.org, do get pretty creative with structured data, but you'll notice that those use itemscope/itemprop/itemtype, which still pass the W3 validator: http://schema.org/docs/full.html In addition, there's no reference to an outside standard, even one that's foreign to generally accepted web development (similar to, say, Facebook's OpenGraph, which is sometimes declared like this): xmlns="http://www.w3.org/1999/xhtml" xmlns:fb="http://www.facebook.com/2008/fbml" xmlns:og="http://opengraphprotocol.org/schema/"> I award realtor.com's web developers 0 points, and may the flying spaghetti monster have mercy on their SEOs. To put that another way: don't copy this.
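The contrast between evttag and spec-compliant custom attributes can be illustrated with a rough validity check. This is a simplified approximation of the HTML5 data-* rule (the real spec is more permissive about what may follow the data- prefix), not a full validator:

```python
import re

def is_valid_custom_attr(name):
    # HTML5 reserves author-defined attributes for the data-* form;
    # anything else (like evttag on a div) fails validation. This is
    # a rough approximation, not the full spec rule.
    return bool(re.fullmatch(r"data-[a-z][a-z0-9\-]*", name))

is_valid_custom_attr("data-evttag")   # -> True
is_valid_custom_attr("evttag")        # -> False
```

Renaming evttag to data-evttag would have kept the same tracking data while passing the W3 validator.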
Technical SEO Issues | CoreyNorthcutt
Maximum <title> length - use full or shorten?
Thanks, that's more accurate to the question I was asking. Appreciate the response.
Technical SEO Issues | JDatSB