How to handle sorting, filtering, and pagination in ecommerce? Canonical is enough?
-
Hello,
After reading various articles and watching several videos, I'm still not sure how to handle faceted navigation (sorting/filtering) and pagination on my ecommerce site.
Current indexation status:
- The number of "real" pages (from my sitemap) - 2,000 pages
- Google Search Console (Valid) - 8,000 pages
- Google Search Console (Excluded) - 44,000 pages
Additional info:
- The vast majority of those 50k additional pages (44k + 8k - 2k) are pages created by sorting, filtering, and pagination.
- Example of how the URL changes while applying filters/sorting:
example.com/category --> example.com/category/1/default/1/pricefrom/100
- Every additional page is canonicalized properly, yet as you can see, roughly 6k of them are still indexed.
- When I enter site:example.com/category in Google it returns at least several results (in most of the cases the main page is on the 1st position).
- In Google Analytics I can see that ~1.5% of Google traffic lands on the sorted/filtered pages.
- The number of pages indexed daily (from GSC stats) - 3,000
And so I have a few questions:
- Is it OK to have those additional pages indexed, or would the "real" pages rank higher if the additional ones were not indexed?
- If it's better not to have them indexed, should I add "noindex" to the sorted/filtered pages, or add e.g. Disallow: /default/ in robots.txt?
- Or perhaps add "noindex, nofollow"? Google would then have 50k fewer pages to crawl, but could that somehow impact my rankings negatively?
- As sorting/filtering is not based on URL parameters, I can't add it in GSC. Is there another way of doing that for this filtering/sorting URL structure?
Thanks in advance,
Andrew
-
Canonical reference links are the preferred technique for this.
If you do nothing, very likely the search engines will decide for you which variations of your pages to index, and the selection may not be ideal. If an index page can be filtered many different ways, the unfiltered version should be referenced as the canonical on each, and a self-referencing canonical link should also be specified on the unfiltered version.
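Using the URL structure from the question (the exact URLs are illustrative), the setup described above might look like this:

```html
<!-- On a filtered/sorted variant, e.g.
     example.com/category/1/default/1/pricefrom/100,
     point the canonical at the unfiltered page: -->
<link rel="canonical" href="https://example.com/category" />

<!-- On the unfiltered category page itself,
     add a self-referencing canonical: -->
<link rel="canonical" href="https://example.com/category" />
```

Note that the canonical is a hint, not a directive, which is consistent with what the asker is seeing: a few thousand canonicalized variants can still end up indexed while Google consolidates signals.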
You don't really want to disallow crawling of the refinement paths yet: if those pages can't be crawled, the canonicals on them can't be seen, so you might very well do more harm than good and find important pages getting de-indexed. If at some point in the future all the refinement-path URLs have disappeared from the index and your desired pages are all indexed properly, then at that point you might want to disallow crawling of the refinement paths in your robots.txt file. But not yet, IMO.
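If you do reach that future point, a rule along these lines could block the refinement paths (the /default/ segment is taken from the asker's example URL and may need adjusting to the real scheme). Note that a plain Disallow: /default/ would not work here, because Disallow matches from the start of the path; since /default/ appears mid-path, a wildcard is needed (Google supports * in robots.txt rules):

```
User-agent: *
Disallow: /*/default/
```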