Thanks for following up, Chris.
Yes, it does sound a bit shady when they don't respond to questions about their product. Let that be a lesson to us all about customer service.
Cheers,
Everett
Hello Ben,
Have you already made the change? Every product I looked at from that page had only one size/color/material, etc... What is the purpose of the change? Will the simple products AND the grouped products to which they belong all be indexable on their own URLs?
Grouping products is one way to reduce the number of product pages on the site by allowing shoppers to choose options (e.g. size, color, material...) from a single product page, where each option would otherwise be listed as its own SKU. If your goal is to allow shoppers to view each as its own product, that's fine as long as you're willing to write 100% unique, useful copy/descriptions for each of those pages. If not, it would be better to keep them grouped.
As for the category page, I don't think it matters much what products are being featured there in terms of how the category page will rank. It will affect the pagerank being distributed to product pages though, so rankings could drop for your grouped product pages if you are only linking/showing simple products on category pages.
Without a little more detail about how you group products and what you're trying to accomplish it's hard to provide a great answer.
Hello Chris,
I had a look at the two products you mentioned and can't really say whether they would be good or bad choices without seeing some specific sites that are using them. Neither of the pages provided links to any demo or case study sites so I couldn't really make a determination. They "sound" good but you'd have to pay $77 and $47, respectively, to see them in action anywhere else outside of the site.
However, they do look to be using their own platform on their site and I had a quick look at the code, which looks fine to me. The text appears in the code and on the page, the markup appears to be correct, and it seems to be working in the SERPs.
Hello,
The plugin creates a new CSS file, which you can access and control to make the data appear any way you like:
http://historyofmormonism.com/wp-content/plugins/schema-creator/lib/css/schema-style.css?ver=1.050
Yes, the data that you are marking up needs to appear on the page, but you can mark up existing data instead of using a plugin that adds new data to the page. Or you can use the plugin to add the data to the page and customize the look by adjusting the CSS file.
My advice would be to learn how to add the markup yourself so you can apply it to whatever is already on the page that you're trying to mark up, instead of relying on the plugin. That way you don't get the ugly box. It isn't that difficult if you can already do basic HTML. For example:
Here is what the plugin is doing:
Alex Baugh
Brigham Young University
Professor
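In HTML terms, the plugin's output boils down to something like this (a hypothetical sketch using schema.org Person microdata; the plugin's exact output may differ slightly):

```html
<!-- Sketch of the plugin's markup: a Person item wrapped in the schema_block container -->
<div id="schema_block" itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Alex Baugh</span>
  <span itemprop="affiliation">Brigham Young University</span>
  <span itemprop="jobTitle">Professor</span>
</div>
```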
The div with the ID "schema_block" is what creates that container. In the CSS file I linked to above it reads as follows (the background and border lines are what create the visible box):

#schema_block {
    clear: both;
    margin: 0 auto 10px auto;
    background: #EEEEEE;
    border: 1px solid #CCCCCC;
    padding: 1em;
    overflow: hidden;
}
You can make that look like whatever you want it to look like. For instance, you could completely remove the margin, background, border and padding. It's not the "box" that Google needs to see, it's the content (e.g. Alex Baugh, Brigham Young University, Professor) on the page and the markup in the code that surrounds it.
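For example, overriding those properties in your own stylesheet would remove the box entirely (the selector comes from the plugin's CSS; the values are just a sketch):

```css
/* Neutralize the plugin's visible box while keeping the markup intact */
#schema_block {
    margin: 0;
    background: transparent;
    border: none;
    padding: 0;
}
```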
I hope this clarifies the issue for you. A good designer/developer should be able to provide further assistance if you are not comfortable editing code or CSS files.
Hello David,
This is an issue for many sites at the moment, I think. I just finished diving really deep into another site having the same problem, and we ended up - after fixing everything that "could" be wrong - deciding that Google has probably tightened the reins on rich snippets displaying for product pages. Whether this is based on page-level authority metrics, number of reviews, trustworthiness of the reviews, etc... I do not know.
It is not guaranteed that a page will display snippets in the SERPs even if the code is implemented correctly. That is about all we can know at the moment. The same goes for rel author thumbs and other enhanced SERPs.
Hello Santaur,
I'm afraid this question isn't as easy as you may have thought at first. It really depends on what is on the pages in those two directories, what they're being used for, who visits them, etc... Certainly removing them altogether wouldn't be as terrible as some people might think IF those pages are of poor quality, have no external links, and very few - if any - visitors. It sounds to me like you might need a "Content Audit," wherein the entire site is crawled using a tool like Screaming Frog, and then relevant metrics are pulled for those pages (e.g. Google Analytics visits, Moz Page Authority, external links...) so you can look at them and make informed decisions about which pages to improve, remove, or leave as-is.
Any page that gets "removed" will leave you with another choice: Allow to 404/410 or 301 redirect. That decision should be easy to make on a page-by-page basis after the content audit because you will be able to see which ones have external links and/or visitors within the time period specified (e.g. 90 days). Pages that you have decided to "Remove" which have no external links and no visits in 90 days can probably just be deleted. The others can be 301 redirected to a more appropriate page, such as the blog home page, top level category page, similar page or - if all else fails - the site home page.
Of course any page that gets removed, whether it redirects or 404s/410s should have all internal links updated as soon as possible. The scan you did with Screaming Frog during the content audit will provide you with all internal links pointing to each URL, which should speed up that process for you considerably.
Good luck!
I completely agree. 1 URL is by far the better choice.
Hello WMCA,
Does BV want you to add that rel="canonical" tag to the main product page, or just to the paginated review pages? If the former, I say no way. If the latter, we should discuss further. It could be advantageous, but I'd rather send the authority to the main product page instead.
Hello Yusuf,
If you have a link to John Mueller saying that, rather than someone else saying he did, I would love to check it out, because the statement is in direct opposition to the one on Google's website here, which says:
"Consolidating link signals for the duplicate or similar content. It helps search engines to be able to consolidate the information they have for the individual URLs (such as links to them) on a single, preferred URL. This means that links from other sites to http://example.com/dresses/cocktail?gclid=ABCD get consolidated with links to http://www.example.com/dresses/green/greendress.html."
Notice it says "helps" though. As always, the tag is a "hint" to Google, which has the right to ignore the hint if it wants to.
Yusuf,
I do believe rel canonical tags merge the link profile of all non-canonical URLs to the one canonical URL.
Also, relying on redirects in this case could be problematic for breadcrumbs.
Hello Marty,
If you have the opportunity to use only ONE URL, to which you will link from all categories - and which will be the one and only canonical for that product - I would use site.com/product/product1. Note the use of a /product/ directory instead of being off the root. I find that having products in a product directory makes diagnoses of issues (i.e. index count, site:domain.com inurl:product searches, Analytics segmentation...) a lot easier. However, if you want to keep it site.com/product1 then that would be fine as well. It would be preferable to using multiple URLs and relying on 301 redirects or rel canonicals, which are effective band-aids, but band-aids nevertheless. It is better to actually fix the problem, which is products living on multiple URLs.
Of course you're going to still want to either 301 redirect or rel canonical the old ones to your canonical version since the URLs are likely already in Google's system and possibly have external links.
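If the site runs on Apache, those redirects could be handled with something like the following in .htaccess (the old paths here are hypothetical; substitute the real duplicate URLs):

```apache
# 301 redirect the old duplicate product URLs to the single canonical URL
Redirect 301 /category-1/product1 http://site.com/product/product1
Redirect 301 /category-2/product1 http://site.com/product/product1
```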
And you should think about what happens to breadcrumbs as well. If a user gets to /product1 from one category vs another, will their breadcrumb change, and how will that be done? Is it OK for usability if the breadcrumb on that product page always references the canonical category (i.e. Home ---> category 2 ---> product1)? I tend to think so, and this may also help your internal linking be more consistent when Googlebot visits the page.
Andrew,
I wouldn't put crawlable versions of reviews found on other sites on my product pages for the same reason you are concerned.
You could put them in an iFrame, which should take care of the issue. However, in order to get the star ratings in the SERPs the Schema markup for aggregate ratings should be outside of that iframe. An aggregate rating doesn't include the review copy since it is including all of the reviews (e.g. 4 out of 5 stars based on 320 reviews).
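As a rough sketch (hypothetical names, URLs, and numbers, using schema.org microdata), the structure would look like this:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <!-- Aggregate rating markup stays OUTSIDE the iframe so crawlers can see it -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars, based on
    <span itemprop="reviewCount">320</span> reviews
  </div>
  <!-- The third-party review copy itself loads inside the iframe -->
  <iframe src="http://reviews.example.com/widget/123"></iframe>
</div>
```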
Meanwhile, I would be working on getting your own unique, real reviews up on your site so you benefit from the user-generated content that nobody else has.
Hello Carl,
I tried to access those pages but didn't find anything on those URLs even after fixing them.
The rel next/prev is being used properly for the issue you're having.
I think the bigger issue is that you say all of your product descriptions are duplicates of any other site using the same affiliate feed. That is a big no-no and I don't think relying on category pages to rank for product searches is a good business plan. I hate to say it, but these types of sites really just don't do very well these days.
Nevertheless, your question was about the paginated URLs and I think the rel next/prev has you covered there. If you are more worried about it you can use a rel = canonical tag to ensure the first page in the series is seen as the canonical version.
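For reference, on page 2 of a paginated series the head section would contain something like this (example URLs):

```html
<link rel="prev" href="http://www.example.com/widgets?page=2">
<link rel="next" href="http://www.example.com/widgets?page=4">
<!-- As mentioned above, one option is to also point every page in the series at page 1 -->
<link rel="canonical" href="http://www.example.com/widgets">
```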
The only thing I might add is that, depending on the business, it might be worth building a "Red Widgets" category (as an example). However, you would treat this like a sub-category and write its own category description, and give it its own rel canonical tag, treating it as the root of the "Red Widgets" category.
Nine times out of ten it isn't necessary to give sorting and filtering options their own category page though, and a rel canonical tag to the canonical version of that page is the second best option. The first best option would be to not change the URL at all, only re-order the items, hiding some and featuring others. Most eCommerce platforms don't have this functionality at present, however. Rel Canonical was made to span the gap until they do.
Hello Carsten,
I am not a developer, but hopefully I can be of some assistance. If not, we'll leave the question open to see if anyone else can be more specific for you.
First, I came across a good list of eCommerce sites using Isotope: http://isotope.metafizzy.co/ . If you scroll down to "Isotope in use" you'll see examples from Anthropologie, Rimmel London, Lexus, Biodroid, etc... Perhaps looking at how they handle things will help. For example, it looks like Anthropologie uses a unique identifying URL for each product that is its own .jsp file (e.g. 4130265414412.jsp), or is meant to appear that way.
A lot of AJAX sites will use a hashbang (#!) to enable crawling of dynamically created content on the same URL each time. Here is more information about that.
The solution you provided above sounds good, but I'd want to see it in action first. Be sure to view the page as Googlebot, and to see how it looks in Google's Cache Preview in the SERPs to make sure they are treating it the way you want them to.
Good luck with this, and please let us know how it turns out so we can be more informed on the issue in the future.
Hello David,
In the case of http://www.godirectappliance.com/ajax/product/quickview/id/734/ I would simply add a disallow statement for the /ajax/product/quickview/ directory in the robots.txt file.
I'm assuming the URLs to these pages are contained in javascript instead of href tags since they are hover-over pages so you don't have to worry about wasting pagerank by linking to pages that are being blocked.
I'm not recommending a block at the /ajax/ or /ajax/product/ level at this point because I don't know what else is in those directories. It could be something you need Google to have access to in order to fully index a page.
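The disallow statement mentioned above would look something like this in robots.txt:

```
User-agent: *
Disallow: /ajax/product/quickview/
```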
I have seen this happen with sites that have been redirected. For example, I redirected site A to site B once - including all of the content from site A - and site B inherited the Titles and descriptions from site A for several months. It was very irritating because I didn't use the words anywhere on the page, and I was using NOODP and NOYDIR tags. I even told Google the site had moved in GWT.
Eventually it sorted itself out, but it made me see how difficult it must be for Google to keep everything straight on so many sites with so many different site elements, especially when redirects are involved.
Both sites had content and were in the same niche. Site A had stronger signals, and was a year older than site B. However, I wanted to redirect to site B for branding reasons.
I suggest you do some digging around to find out if any other site has been 301 redirected to your site. If so, did that content ever appear on it?
Good luck and let us know what you find out!
Helen,
I know people who have had success in reducing the number of links within mega menus by turning some of them (after the first two levels, for instance, but you could get much more sophisticated if you wanted) into javascript links. If the javascript is not too complex Google will still have no trouble getting to those pages, but the links won't be "hrefs" and therefore won't waste pagerank on pages that are not as important relative to the others. The upside to this is that the links are still there for users, assuming that is a good thing.
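One hypothetical way to do that (class names and paths here are made up) is to swap the deeper anchor tags for elements that navigate via a small script:

```html
<!-- Instead of: <a href="/very-deep-category/">Very Deep Category</a> -->
<!-- No href, so no pagerank flows, but users (and usually Googlebot) can still get there -->
<span class="menu-link" onclick="window.location.href='/very-deep-category/'">Very Deep Category</span>
```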
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences slow, steady traffic drops I always look into the uniqueness of their product copy. That is often a sign that they are sharing product copy with other sites, either due to manufacturer description use, or by publishing feeds to 3rd party sites like Amazon, eBay or price comparison shopping engines.
Good luck!
As I said before, a 301 redirect will pass pagerank. Even if it goes to a blocked folder, that's still domain-level benefit coming into your site from "paid" links.
The best solution, in my opinion, is for sites to run their affiliate program through another domain first, and 302 (temporary) redirect the user to the main site.
Affiliate links to www.YourAffiliateDomain.com/?afflink-id=123, which has a domain-wide robots.txt disallow. The ?afflink-id=123 part tells the system where to redirect the user to on the primary domain. The user goes from that URL through a 302 redirect to the appropriate URL on your primary domain.
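On the affiliate domain, that setup might look something like this (hypothetical rules for an Apache server; adapt the IDs and target URLs to your own system):

```apache
# robots.txt on www.YourAffiliateDomain.com blocks the whole domain:
#   User-agent: *
#   Disallow: /

# .htaccess: 302 (temporary) redirect each affiliate ID to its landing page
RewriteEngine On
RewriteCond %{QUERY_STRING} ^afflink-id=123$
RewriteRule ^$ http://www.yourmaindomain.com/landing-page/? [R=302,L]
```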
No pagerank is passed and you can kill off the domain if you ever need to and those redirects will stop coming into the site.
If you are unable to do all of this you can submit a disavow file for all non-compliant affiliate domains after asking them to nofollow their links. I think the limit is supposed to be 2,000 domains, but I've heard of people doing as much as 4,000 with no problem. Just give it a try and see what happens.
Sometimes catalog businesses have customers who feel more comfortable with an online version of their printed catalog than an eCommerce site. In such cases it can be a good user experience to have those available online.
However, I would not allow them to be indexable.
I'm assuming the products are all available on product pages in the eCommerce site, which is where I would focus my efforts. The online version of the printed catalog in PDF (Other formats, such as flash or AJAX work well too) should just be a feature for users, not necessarily for search.