Sorting Dupe Content Pages
-
Hi,
I'm no Excel pro, and I'm having a bit of a challenge interpreting the Crawl Diagnostics export .csv file.
I'd like to see at a glance which of my pages (and I have many) are the worst offenders for duplicate content – i.e., which have the most "Other URLs" associated with them.
Thanks – I'd appreciate any advice on how other people are using this data, and/or how Moz recommends doing it.

-
I've found this a little frustrating, too. The web display shows the number of duplicate URLs, but the exported spreadsheet does not. It does, however, list all of the duplicate URLs in one cell -- so you could calculate the character length of that cell, sort by that column, and get a rough ranking.
-
CMC is correct - that's how I do it for larger sites.
- delete all columns except the URL column (Col A) and the duplicate pages column (now Col B)
- in cell C2, enter the formula =LEN(B2); it calculates the number of characters in the dupe-pages cell
- drag that cell down to the last row to fill the formula
- select all three columns and sort Col C largest to smallest
Obviously this isn't going to give you an exact count of dupe pages, since URL text strings vary in length, but it gives you a pretty good idea of the worst offenders.
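-
If you'd rather get an exact count instead of the character-length approximation, a short script can split that packed cell and count the actual URLs. This is just a sketch: the column headers ("URL", "Duplicate Page Content") and the whitespace/comma separators are assumptions -- check them against your own export and adjust.

```python
import csv
import io

# Sample data standing in for the exported .csv (replace with your real file).
# The duplicate URLs are assumed to be packed into one cell per row.
sample = '''URL,Duplicate Page Content
https://example.com/a,"https://example.com/a2 https://example.com/a3"
https://example.com/b,https://example.com/b2
https://example.com/c,
'''

def dupe_count(cell):
    """Count the URLs packed into a single cell, splitting on commas/whitespace."""
    return len([u for u in cell.replace(",", " ").split() if u])

# With a real file, use: open("crawl_diagnostics.csv", newline="") instead of StringIO.
rows = list(csv.DictReader(io.StringIO(sample)))

# Rank pages by how many duplicate URLs they carry, worst offenders first.
ranked = sorted(rows, key=lambda r: dupe_count(r["Duplicate Page Content"]), reverse=True)
for r in ranked:
    print(dupe_count(r["Duplicate Page Content"]), r["URL"])
```

Same idea as the LEN trick, just counting URLs instead of characters, so two long URLs no longer outrank three short ones.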