Errors - 7300 - Duplicate Page Content..Help me..
-
Hi,
I just received a crawl report with 7,300 duplicate page content errors.
The site is built using PHP.
The list of errors looks like this:
http://xxxxx.com/channels/?page=1
http://xxxxxx.com/channels/?page=2
I am not good at coding and am using a ready-made script for this website. Could anyone guide me on fixing this issue?
Thanks.
-
This appears to be a pagination issue. If so, the solution may be fairly simple. You have a few options. You might want to first make sure that your canonical tags are in order. How you handle them depends on whether or not you want the pages in a paginated series (like a category page with more than one page of products listed) included in Google's index. If you want them indexed, then each paginated page should have its own rel=canonical tag, specific to that page. If you really only want the first page included in the index, then you could include a canonical tag pointing at the first page at the top of each page in that particular paginated series. You may also need to include rel="next" and rel="prev" tags, depending on your content.
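To illustrate, here is a minimal sketch of what those tags might look like in the `<head>` of page 2 of a paginated series. The URLs are placeholders standing in for your real ones, and you would use only one of the two canonical options, depending on which behavior you want:

```html
<!-- Hypothetical example for page 2 of a series like /channels/?page=2 -->

<!-- Option A: keep every page in the index; each page canonicals to itself -->
<link rel="canonical" href="http://example.com/channels/?page=2">

<!-- Option B: only index the first page; every page canonicals to page 1 -->
<link rel="canonical" href="http://example.com/channels/">

<!-- Either way, rel=prev/next tells Google the pages form one series -->
<link rel="prev" href="http://example.com/channels/?page=1">
<link rel="next" href="http://example.com/channels/?page=3">
```

On the first page of the series you would omit rel="prev", and on the last page you would omit rel="next".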
Here is an excellent video on pagination from Google that describes various options,
depending on what type of content you have and how you would like it to be indexed:
http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
-
Thanks a lot, Dana. But in my report there is no warning about canonical issues. So, if I add the code, will it get resolved?
-
You are welcome. I would add a canonical tag to the first page of any category or article page that results in multiple (paginated) pages. Watch the video; there are several different ways to go. One easy thing you could do is add a "View all" link to the first page, add your canonical tag to the page in its "View all" state, and that should resolve it. There really are two or three ways to solve the problem; it just depends on your content and preferences. You will also want to make sure that you direct Googlebot not to crawl or index your search results pages. This can be done in your robots.txt file.
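As a rough sketch, a robots.txt rule for blocking search results pages might look like the following. The /search path here is just a guess; use whatever path your script actually serves search results from:

```
User-agent: *
Disallow: /search
```

Keep in mind that robots.txt only stops crawling; if search results pages are already in the index, adding a noindex meta tag to those pages is another option worth considering.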
Because it takes time for Google to crawl, index and recognize your new canonical tags, it might take a few weeks for the duplicate content errors to go away. The same will be true for Roger Mozbot.
This is all, of course, presuming that pagination is your problem. It could be that there's another issue, but this is definitely worth trying.
-
Thanks Dana.
I am watching the video now.. will be trying to fix it...