Posts made by danatanseo
-
RE: Paid links that are passing link equity from a blog?
Anthony, thanks so much for chiming in. My gut instinct is to agree with you. I think I'm going to sit on this one and think about it for a little bit. It will also give me some time to formulate the best approach if I do decide to ask him to change the links to "nofollow."
-
RE: Paid links that are passing link equity from a blog?
I really like this idea. I think diplomatically approaching it is key. We do have one person at our company who is good friends with this particular blogger. I need to use him as my emissary, but it will be a hard sell. Within the first month of me coming on board here he introduced himself to me this way: "Hi, my name is xxxxxx and I don't give a crap about SEO."...so I will have an uphill battle there, LOL. Ah well, 50% of SEO is diplomacy, isn't it?

Thanks for responding. I appreciate it very much.
-
RE: SEO can id and class be used in H1?
Hi Keith,
This is totally fine. Using a class within the <h1> tag is very common, and the ID isn't a problem either. Googlebot still understands that the element is an <h1>.
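To make that concrete, here's a minimal sketch of the kind of markup I mean (the class and id values are made-up examples, not anything from your site):

```html
<!-- Hypothetical markup: the class and id values are illustrative only -->
<h1 class="page-title" id="main-heading">Audio Cables</h1>
```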
I think as time goes forward, some of these technical SEO elements will become less important. Search engines are going to understand, particularly as HTML5 becomes more widely adopted, that heading tags are used as much to define stylistic elements as structural elements. Hope that helps!
Dana
-
RE: Hello I have an ecommerce site where many of the products are variations.
Hi Edward,
This is a very common problem in eCommerce. Although I don't have experience with Yoast's WooCommerce, I imagine there are several possibilities for dealing with this issue.
Depending on how you handle your inventory, there are a few options.
If you assign specific part numbers to all the different "versions" (i.e., lengths, colors, flavors, sizes, etc.), which I think is probably most often the case, I would look in WooCommerce to see if there is a way to set up a "parent/child" relationship between a master product page and all the various versions of a product that might be available. Usually this means creating a "parent" part number where the master page "lives," and then all the separate versions become children of that page. When someone makes a selection, the system pulls the specific part number corresponding to that selection and puts it in their cart. Technically, the child products still have "pages," but if you use a canonical tag on the parent page, leave it off the individual child "pages," and also keep those child pages from being crawled and indexed, you will have solved your problem. There are a number of ways to keep the "child" product pages out of the index, either via robots.txt or a meta robots "noindex" tag (a rough sketch of both pieces is below).
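This is only a minimal sketch under those assumptions; the URL and product name are made up, and the exact mechanics will depend on how WooCommerce exposes these settings:

```html
<!-- On the "parent" master product page (hypothetical URL) -->
<link rel="canonical" href="http://www.example.com/products/audio-cable">

<!-- On each "child" version page, to keep it out of the index -->
<meta name="robots" content="noindex">
```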
The other choice, which is also probably available in WooCommerce, is to set up a single product page (again, you would have to create an item ID or part number for it that won't necessarily be a real product in your inventory). Then create options for all the different versions of the item, presented to visitors via a drop-down menu, radio buttons, etc. Again, you can usually map these product "options" to specific part numbers in your database. The difference is really presentation. With a drop-down selector, your visitor will only be able to add one part number to their shopping cart at a time; they would have to revisit the page and re-select different options if they wanted the same item in, for example, different lengths. The parent/child solution is nice if you want to allow your visitors to order five 10-foot cables and six 20-foot cables on the same page at the same time.
Every platform is a little different. I have used these solutions in Volusion and 3Dcart and seen them in Magento. It can take some back and forth to get things laid out and functioning the way you want, but from an SEO viewpoint the work is worth it, because the "hub" pages you in effect end up creating have a lot more value to searchers than a set of nearly identical pages for cables that are exactly the same but differ only in length.
Long answer, I know, but I really hope it helps!
Dana
-
Paid links that are passing link equity from a blog?
We have a well-known blogger in our industry with whom we've had a long-standing relationship. We've had inbound links from his blog for many, many years. Today I noticed that we have a banner ad running on all pages of his blog under a heading that says "Sponsors."
He has dedicated an entire page of his site to giving full disclosure of all advertising. However, all of the links on his site pointing to us are passing link equity. To my knowledge they've been this way ever since they were first established years ago.
I am fairly certain this fellow, with whom we have an excellent relationship, neither knows nor cares what a "nofollow" attribute is. I am afraid that if I contact him and ask him to add "nofollow" attributes to all of our links, it will damage our relationship by creating friction. To someone who knows nothing and cares nothing about SEO, asking them to put a "nofollow" on a link could either seem like a technical request they don't know how to handle, or something even potentially "shady" on our part.
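For reference, the change being asked for would be a one-attribute edit to each ad link, something like this (the URL and anchor text are placeholders, not the real links):

```html
<!-- Before: a followed sponsor link -->
<a href="http://www.example.com/">Our Company</a>

<!-- After: the same link with the nofollow attribute added -->
<a href="http://www.example.com/" rel="nofollow">Our Company</a>
```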
My question is this: Considering how long these links have been there, is this even worth worrying about? Should I just forget about it and move on to bigger fish, or, is this a potentially serious enough violation of Google Webmaster guidelines that we should pursue getting those links "nofollow" attributes added?
I should add that we haven't received any "unnatural" link notifications from Google, ever, and haven't ever engaged in any questionable link-building tactics.
-
RE: Please help me articulate why broken pagination is bad for SEO...
Thanks so much Gianluca for this thoughtful and valuable advice.
Yes, page load speed is definitely something that's been a concern. This is why we went back to 24 products displayed per page instead of 50 a few months ago. However, since then we've made some significant improvements in page load times and we think we can probably go up to 100 products per page and still be fairly fast. We will have to test.
On the up side, we only have 7 categories with more than 100 products, and only 24 with more than 50. The biggest problem we have affecting speed isn't so much the images; it's the fact that the website makes real-time pricing calls to our business back end for every product every time the page loads. This may be a sticking point.
I have also thought about the canonical tag problem. Of course, it's a problem now too, but if the "View All" page just ends up getting that generic URL and no proper canonical tag...then we really are back to square one.
The possibility of noindexing all of the categories that are part of paginated series is something that crossed my mind yesterday, so it's interesting that you mentioned it. While it would solve certain issues, wouldn't it be a problem in terms of having valuable content in Google? Granted, some of our category pages are purely there for navigation purposes, in which case I suppose there's no harm in noindexing them. However, with the roll-out of Hummingbird I began looking at our category pages as valuable opportunities for "topic" pages that could act as hubs for visitors searching for products or information around specific uses or brands.
Wouldn't there be a significant risk in losing valuable market share for key terms by removing so many category pages from Google's index?
If I am understanding your last suggestion correctly, you are saying to have the page default to "View All" and noindex everything else... You are right, it's not a great scenario, but you are also right that it may be the only solution given management's steadfast stance on not wanting to pay to fix it.
Lots to think about, but your comment has been extremely helpful. Thanks again!
-
RE: Please help me articulate why broken pagination is bad for SEO...
Thanks so much EGOL. I always love your candor.

Believe me, when I went home last night to ponder solutions to this problem, everything you mentioned crossed my mind. It was a thoroughly frustrating conversation to have. It simply amazes me that Google can tell the world very clearly all the things that will help their sites do better in the SERPs, yet people continue to ignore all of that advice, do what they want (or whatever is "easy" or cheap), and then whine about why their sites aren't doing well.
Making the commitment to hire an in-house SEO without equipping them with good tools and refusing to take their advice is like hiring an astronaut, handing them a box of toothpicks and some gunpowder and saying you expect them to land on the moon.
-
RE: Please help me articulate why broken pagination is bad for SEO...
Thanks so much Andy. Agreed on all points. I think I have convinced the powers that be that at the very least we should add a "View All" option. This would give both end-users and Google a useful means to access all of the products in a category at once, without having to resort to pagination if they didn't want to. It is something we can add fairly easily and at little to no cost. Since only 8 of our category pages have more than 100 products, and none go higher than 200, this seems like a very reasonable compromise, at least for now.
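Once the "View All" page exists, one common way to tie the paginated pages to it is a canonical reference in the head of each page in the series. This is only a sketch; the ?view=all parameter below is hypothetical, not something our platform has today:

```html
<!-- Hypothetical: on each paginated page, pointing to the "View All" version -->
<link rel="canonical" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?view=all">
```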
I very much appreciate you taking the time to respond. It was a frustrating day and a frustrating conversation to have to have.
-
Please help me articulate why broken pagination is bad for SEO...
Hi fellow Mozzers.
I am in need of assistance. Pagination is broken on the website for which I do SEO in-house, and it has been broken for years.
Here is an example: http://www.ccisolutions.com/StoreFront/category/audio-technica
This category has 122 products, broken down to display 24 at a time across paginated results. However, you will notice that once you enter pagination, all of the URLs become this: http://www.ccisolutions.com/StoreFront/IAFDispatcher
Even if you hit "Previous" or "Next" or your browser back button, the URL stays: http://www.ccisolutions.com/StoreFront/IAFDispatcher
I have tried to explain to stakeholders that this is a lost opportunity: if a user or Google were to find that a particular paginated result contained a unique combination of products more relevant to a searcher's query than the main page in the series, Google couldn't send the searcher to that page because it doesn't have a unique URL. In addition, this shared URL is most likely bottlenecking the flow of page authority internally. And that's not to mention that 38% of our traffic in Google Analytics is reported as coming from this one URL... a problem, because it could be any one of several hundred pages on the site and we have no idea which one a visitor was actually looking at.
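Just to illustrate what a fixed implementation could look like: the ?page parameter and the rel="prev"/rel="next" tags below are one common way of marking up a paginated series, not something our platform currently supports, and the URLs are hypothetical:

```html
<!-- Hypothetical head markup for page 2 of the series, which would have its own crawlable URL -->
<link rel="prev" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=1">
<link rel="next" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=3">
```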
How do I articulate the magnitude of this problem for SEO? Is there a way I can easily put it in dollars and cents for a business person who really thinks SEOs are a bunch of snake oil salesmen in the first place?
Does anyone have any before-and-after case studies or quantifiable data that they would be willing to share with me (even privately) that could help me articulate how important it is to address this problem? Even more, what can we hope to get out of fixing it? More traffic, more revenue, higher conversions?
Can anyone help me go to the mat with a solid argument as to why pagination should be addressed?
-
RE: Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
I read your post at Mstoic, Hemant, and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did, I received a message that Dust-Me Spider was not compatible with this version of Firefox, and it was disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing). Do you have any experience using the upgraded version? Does it deliver what it promises?
Thanks!
-
RE: Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I run the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
-
RE: Can Googlebot crawl the content on this page?
Thanks so much, Bill and Brian. This is exactly what I was thinking. I did the same thing Bill suggested initially and took a snippet from one of the reviews and did a verbatim search, and got nothing. What I thought this told me was that yes, the page was indexed, but not the content. The fact that the cached version renders the content from the JavaScript only shows that the script was executed, not necessarily that any of the content it contains was actually indexed.
From an SEO standpoint I think this is valuable content that the dealer would very much want indexed. While the service providing the javascript might be very convenient, and the majority of end users might be able to consume the content, the fact that it's not searchable, to me, means it's an opportunity lost.
Thanks again everyone.
-
RE: Can Googlebot crawl the content on this page?
Hah! Thanks Andy. Must not have had enough coffee this morning. I didn't even think of looking at the cache...so obvious, lol! Thanks so much. You are spot on.
-
Can Googlebot crawl the content on this page?
Hi all,
I've read Google's documentation on Ajax and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really.
I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any ugly Ajax URLs included, as Google suggests doing. Also, the page is definitely indexed, but it appears the content is only indexed via its original sources (Yahoo!, Citysearch, Google+, etc.).
I understand why they are using this dynamic content: it looks nice to an end user and requires little to no maintenance. But is it providing them any SEO benefit? It appears to me that it would be far better to take these reviews and simply build them into the HTML itself, along the lines of the sketch below.
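This markup is entirely made up (not taken from their page); it's just meant to show reviews served in the page source rather than injected by a third-party script:

```html
<!-- Hypothetical static review markup in the page source -->
<div class="customer-review">
  <blockquote>Great service and a fair price on my new windshield.</blockquote>
  <cite>A. Customer, via Citysearch</cite>
</div>
```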
Thoughts?
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all,
So far I have found this one: http://unused-css.com/. It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time?
Thanks!
-
RE: Why does expired domains still work for SEO?
Greetings, I am going to weigh in here, not because I am any kind of Yoda at all, but purely from a common sense point of view. I hope that's okay.
I would deduce that if anyone were able to know when a domain was released and how soon it sold thereafter, it would have to be the domain registrar. So let's say, hypothetically, that some domain registrar decides to start publishing a list of domains that were released for sale and then sold immediately. Then let's say Google gets a feed of that list and, automatically via the algorithm, discounts every single one of those domains down to PR 0 and strips them of all potential link authority value...
I'm sure you can see dozens of problems with that scenario. Here are just a few:
1. No one can really evaluate the new owner's identity or purpose without knowing who the new owner is. If registrars disclosed that information, I can't even imagine the number of privacy issues that would arise.
2. The assumption would be that the new owner is not the same as, or related to, the old company. I'm sure there are plenty of cases where the new owner actually is the same or related.
3. Google would be making the assumption that the domain was sold to a new owner because the business was ending. Again, there are probably many, many instances when this is not the case.
It seems to me that neither Google nor any other search engine can reasonably deduce the motives of a new domain owner. I mean, there are some smart folks at Google, but I don't think clairvoyance has entered the algorithm yet. Consequently, it probably seems more reasonable to let expired domains retain some of their value, on the belief that most business owners are only going to buy domains relevant to their business and that end users will cast their "votes" for how well the new owners use the real estate by either engaging or bouncing and viewing another site. Eventually, the algorithm will more or less accurately sift through the results and serve up the ones that visitors find engaging.
Sure, maybe it works for a year, two years, hell, even three years. So maybe this approach is viable, for now, for a website or a page that just seeks short-term benefits. But if what you are building is a business that you want to last, a brand that you want to matter to people 20, 50, 100 years from now? Then I think there are far better uses of your time, effort, and resources.
-
RE: How complicated would it be to optimize our current site for the Safari browser?
Thank you all so very very much. Matt, I am going to drop you a PM as you suggested. Paul, wow...thank you for sharing your insights here. I am sure this is extremely helpful information not just for me, but for many other folks here who've observed some of the same things on their sites. Yes, I agree that the conversion issue could very well be a mobile optimization problem and not a Safari problem as we have spent very little to no time optimizing our mobile site. Given the traffic levels coming in via that channel, it's probably time to get crackin'!
You guys are awesome!
-
RE: Should I allow a publisher to word-for-word re-publish our article?
Hi David,
I understand your concerns about guest blogging; however, I think you can share your article with other sites, i.e., "syndication," if you just take care of some details. First and foremost, make sure it's a site that's relevant to you or your potential audience. It sounds like it is, so you're probably good to go there. Second, make sure you have a canonical tag in place on your original content. This may or may not matter in terms of how Google attributes the content if the site you post to has higher authority than yours, but that's okay, because what you're after is the audience and traffic, not the link or link equity. Lastly, to assuage your concerns about any potential penalty from being associated with something that says "guest blog" on it, ask that you get attribution, but that any links back to your site are given the rel="nofollow" attribute. This is somewhat out of your control, but you can at least attempt to cover that base. (A quick sketch of both pieces is below.)
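These snippets are only placeholders (example.com and the article path are made up): the canonical sits on your original article, and the republished copy links back with nofollow in its attribution line:

```html
<!-- On your original article (hypothetical URL) -->
<link rel="canonical" href="http://www.example.com/blog/original-article">

<!-- In the republished copy's attribution line -->
<p>This article originally appeared on
  <a href="http://www.example.com/blog/original-article" rel="nofollow">Example.com</a>.</p>
```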
Above all, no matter what, make sure you get full attribution and that you, or whoever wrote it, is listed as the author.
We have syndicated many of our articles to blogs and online magazines that appeal to our audience. Sometimes the content gets attributed to the blog, even if it appeared on our site first, when the blog is a high-authority site. Sometimes we even end up getting followed links back simply because the blog editor doesn't know how to add "nofollow." Like you, we don't do it all over the place; we are very selective and only offer specific pieces to specific places. If you think about it, a huge amount of news content online is syndicated. Syndication has always been an accepted way of sharing content. As long as it's done for the purpose of providing interesting information to a particular audience rather than for the sake of a link, I think you're perfectly fine doing so.
Hope that helps!
-
How complicated would it be to optimize our current site for the Safari browser?
Hi all! Okay, here's the scoop. 33% of our site visitors use Safari. 18% of our visitors are on either an iPad or iPhone. According to Google Analytics, our average page load time for visitors using Safari is 411% higher than our site average of 3.8 seconds. So yes, the average page load time in Safari is over 20 seconds... totally unacceptable, especially considering the large percentage of traffic using it.
While I understand that there are some parameters beyond our control, it is in our own best interest to try to optimize our site for Safari. We've got to do better than 20 seconds. As you might have guessed, it's also killing conversion rates on visits from that browser. While every other browser posted double-digit improvements in conversion rates over the last several months, the conversion rate for Safari visitors is down 36%, translating into tens of thousands in lost revenue.
Question for anyone out there gifted in web design and, particularly, web development: do you think it's possible/reasonable to attempt to "fix" our current site, which sits on an ancient platform with ancient code, or is that just not realistic? Would a complete redesign/replatform be the more realistic (and financially sound) way to go?
Any insights, experiences, and recommendations would be greatly appreciated. If you're interested in spec'ing out the project and giving us a cost estimate, please private message me. Thanks so much!
-
RE: Can anyone speak to the pros and cons of installing mod_expire on an Apache server?
Hi Thomas,
My apologies for the extremely long time it took me to respond. Once Fall hit we got very busy! Thanks so much for the info on hosting and CDN, this is very, very helpful. We have just gone through several months where we experienced a significant amount of downtime, so I am hoping that if I kick and scream enough and show these folks how downtime impacts their revenue, they will finally make a change. Things move very, very slowly in small businesses, so I will have to be patient, but hopefully we can get there. Your suggestions have been pure gold and I really, really appreciate you taking the time to share your expertise.
Dana