False Negative Warnings with Crawl Diagnostic Test
-
Hey Ryan, thanks. Thumbs up! I like your answer more

I should have looked further into this question. For some reason, I read the beginning and assumed we were talking about shopping-cart-style checkout pages.
And you're right, adding a nofollow to those links is a weak way of addressing Anthony's issue. Thanks for keeping me in check, Ryan.
-
Dang... thanks Ryan for such an in-depth response! Gimme a few to take it all in and I may follow up with a few more questions. And Donnie, I appreciate the attention as well!
-
Hey Ryan,
So from the answer you provided... we've been on a long journey trying to resolve the aforementioned issues. We took a little break over the 4th but then got back to it a few days ago. For the most part, I believe we have at least concluded what needs to be done, or whether anything can actually be done regarding a solution. It is a bit tricky because we are working with Volusion (a 3rd party "shopping cart" service), which definitely limits flexibility. There are a lot of 'pros' with a service such as Volusion (especially with limited resources and knowledge), but the 'cons' are beginning to appear as our knowledge of SEO, web design, etc. begins to grow. Anyway, I wanted to provide a thorough response to your answers and also throw in a few other questions that have arisen since.
1. In regard to the 302 Temporary redirects on all product-specific shopping cart pages... we cannot apply a 301 redirect because it would then prevent customers from actually accessing the shopping cart page after adding a product to their cart. We were told, "if you redirect from the shopping cart page, the customer will not be allowed to check out. Each page is needed, so taking them out will cause errors on the site." I was then told to speak with their marketing services department on the issue, as they will be able to help solve my SEO needs. I have emailed them and hopefully will hear back. Most likely there is no way to resolve this issue.
2. You said our duplicate page and duplicate content issues can be resolved with canonical links. As you noticed, there are canonical links on the product, category, and home pages of our site. I wanted to mention that there is an "SEO friendly" way to apply these canonical links with Volusion: you just select a button that says "enable canonical links" in the back end of the store.
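For anyone following along, a canonical link is just a tag in the page's head section. A minimal sketch, with a purely hypothetical URL (not an actual page from our store), would look something like this:

```html
<!-- Hypothetical example: placed in the <head> of a duplicate page,
     pointing search engines at the preferred version of the URL -->
<link rel="canonical" href="http://www.example.com/some-product-p/12345.htm" />
```

Search engines treat the page carrying this tag as a duplicate of the URL in `href`, consolidating ranking signals there.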
After speaking with Volusion support on this matter, we basically concluded that they forgot to apply these links to the "Email a Friend" and "Email When Back In Stock" pages. I have sent the SEO department an email on this as well and expect to get one of the following three responses:
1. "We will look into this as a future feature request"
2. "There is nothing that can be done"
3. "We know about this but don't worry, it will not impact your search rankings"
Either way, if they tell me there is no short-term solution... I will look into applying a "noindex, follow" tag.
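For reference, a "noindex, follow" directive is a single meta tag in the page's head. A sketch of what it would look like on one of those pages (assuming Volusion lets you inject it there):

```html
<!-- Hypothetical example for an "Email a Friend" page: keep the page
     out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

Note that `follow` is the default behavior, so `content="noindex"` alone is usually equivalent.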
3. I did not mention this issue in my initial question, but we are also receiving a 'warning' of "too many links on page". In regard to keeping our on-page links under 100... other than the homepage and product/category index pages, we have done a pretty good job of limiting the number of links per page. With that said, we have run into somewhat of an issue with category pages that have 70+ products assigned. We have set the default to show 60 products per page, but it appears the crawlers are picking up all products (even the ones on the 'next' pages) for that page, which is making the links per page very high. For example... the link below is showing 244 on-page links.
http://www.beautystoponline.com/Ardell-False-Eyelashes-s/71537.htm
There is no way there are that many links on this single page, but there are probably almost 200 products assigned to this category, which explains the high number of links. We were told this is occurring "due to the fact that all Category pages are generated as 'search results' pages (based on the category filter), and because of this, there is very little you would be able to do, as the code that generates search/category pages is system code that cannot be modified."
We were also told that we could submit it as a feature request on their forum, and if it is an idea that's popular amongst other merchants, their developers may take it into consideration and change how the links are coded in the future. Apart from all this... do you by chance have an opinion/suggestion for a solution? (if any)
A quick side note on this topic... back to me mentioning that our category and product index pages are showing thousands of on-page links. It is self-explanatory why this is happening... but would you say it is a bad thing for SEO purposes? I know it's good for site structure and passing link juice, meaning that all pages on our site are only one click away from the root domain. Right?!
4. Another issue I did not previously mention was 'META titles over 70 characters'. I just wanted to confirm that if a title is more than 70 characters, the only negative is that the title is truncated and the full name won't appear in the search results. Past that, there shouldn't be any negative effect on Google search rankings from this, right? We have a few of these issues, but for the most part... the time it would take to correct a few characters over 70 is not worth it if there is no impact on search rankings.
Anyway man... if you do reply to this 2nd post, your time is greatly appreciated and I thank you.

-
Hello again.
Thanks for sharing the information on #1 and #2. I have heard of Volusion before but have no experience with them. Based on what you have shared, it seems they may not be a great solution from an SEO perspective.
For #4, you are correct. The "META titles over 70 characters" warning means long titles will be truncated in search results. The other main consequence is that a title's weight is divided among the words in the title: the longer the title, the less weight applied to each term. If you know and understand these factors, you can choose to ignore the warning.
For #3, you definitely do not want thousands of links on a page. You need to figure out a way to significantly lower the number of links; search engines will follow a percentage of them and then stop. Yes, I would say this is bad for SEO.
Somehow you need to categorize the links. Many blog sites will group links by month for the current year, and by year for past years. You could group by categories. Do something to get your number of links under control. You don't have to be under 100, but for now I would say you should be under 250 links.
-
Hello Ryan,
Thanks again for the reply... your time is appreciated. We are currently working on creating a site map to 'categorize' the links in both our product and category indexes. This should take care of the two highest counts of on-page links across our site. The majority of these warnings are under 250 links, so we should be good. Or let's hope so, because there really isn't anything we can do about it at this point. Also, by chance do you know of, or can you refer, a company or independent designer who builds site maps? We have the XML sitemap file generated for Google; we just need someone to make it look nice.
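As a side note on what that generated file contains: an XML sitemap is just a list of URLs for crawlers, which is separate from a "nice looking" HTML sitemap page for visitors. A minimal sketch (the URL and date below are placeholders, not real entries from the store):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch per the sitemaps.org protocol;
     <loc> and <lastmod> values here are illustrative only -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-category-s/123.htm</loc>
    <lastmod>2011-07-01</lastmod>
  </url>
</urlset>
```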
Oh yeah, regarding all those duplicate title and duplicate content errors... they should be taken care of with a disallow in the robots.txt file. With that said, on our last SEOmoz crawl the errors still came up on those same "Email a Friend" and "Email When Back In Stock" pages. Now... I did submit the robots.txt file during the last scan, so this may be the reason. So before I start to wonder any further, I am going to wait until the next crawl is complete. Maybe you might know... going forward, will SEOmoz still pick up those duplicate page and title errors in the crawl with the disallow in robots.txt?
Also, our Webmaster Tools account is showing 180 "restricted by robots.txt" crawl errors... all from the "Email a Friend" and "Email When Back In Stock" pages that the disallow in robots.txt was just applied to. I understand that even with the robots.txt disallow, Google can still crawl whatever it chooses. Is this anything we should be concerned about? Also, please note that we have thousands of these pages and Webmaster Tools is only showing 180 of them.
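For context, the disallow rules in question would look something like this. The two paths below are placeholders, since I'm not certain how Volusion actually names these pages:

```
# Hypothetical robots.txt sketch; the Disallow paths are assumed
# patterns for the "Email a Friend" and "Email When Back In Stock"
# pages, not confirmed Volusion URLs.
User-agent: *
Disallow: /email-a-friend
Disallow: /email-when-back-in-stock
```

Each `Disallow` blocks crawling of any URL whose path starts with that prefix, which is why a single rule can cover thousands of pages.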
Thanks again for your help
-
Anthony,
You can begin a crawl of your site anytime. Click on Research Tools from the menu bar and scroll down to On-Page Optimization Tools > Crawl Test. This will allow you to confirm your robots.txt settings are set correctly.
For sitemaps, http://xml-sitemaps.com/ seems to be quite popular. I would suggest checking them out first. They offer a free test for up to 500 pages, and it is $20 USD to buy their product if you like it.
For Google WMT, the "restricted by robots.txt" errors can be disregarded if you are confident the pages should be blocked. I would recommend allowing Google to crawl your site whenever possible and using the noindex meta tag to prevent the pages from being indexed. This approach would eliminate those errors.
-
Thanks a lot man. I'm going to check out that sitemap site. Also, I'm going to look into applying those "noindex, follow" tags on the pages instead. Thanks again.

-
Hey Ryan,
So I just confirmed with Volusion that certain pages can have the "noindex, follow" tag and certain pages cannot; it's just the way their system is set up. So for the pages that can, I will for sure apply the "noindex, follow" tag, and for the pages that cannot, I will go ahead and apply a disallow in robots.txt. Also, if you wouldn't mind confirming... it's the "noindex, follow" meta tag that I should apply, not the "noindex, nofollow" tag?
Thanks for all of your assistance and guidance through all of this troubleshooting!
-
As a rule, don't use "nofollow" on internal links.
-
Thanks mate, I have been searching for a couple days on how to fix that warning.
-
I just set the footer links, which appear on every page, to nofollow for the sites I care about, because otherwise those sites get thousands of links all from the same domain - this can't be good for the target site.
I then made a single followable link to each of the sites I care about. I am hoping this is a good strategy. Sorry to digress from the original interesting topic.
-
If you trust the target site, follow the link. If you don't trust the target site, nofollow all the links.
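In markup terms, the difference is just the `rel` attribute on the anchor. A sketch (the domains are placeholders):

```html
<!-- Followed link to a site you trust -->
<a href="http://trusted.example.com/">Trusted Site</a>

<!-- Nofollowed link to a site you don't trust -->
<a href="http://untrusted.example.com/" rel="nofollow">Untrusted Site</a>
```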
If you feel the footer links will actually be seen and used, keep them. If they are not likely to be seen or used, I would suggest removing them.
-
To the OP,
We are also on Volusion and have found that adding the Meta Robots tag for "noindex, follow" in the meta override area for categories has worked for us. However, we haven't found a way to add it to the SearchResults page at this time.