Vipin
My understanding is that if you block anything in robots.txt, you prevent link juice from flowing to those pages. If the pages in question have links pointing to them, then I agree with Highland: 301 redirect them to the correct pages.
shivun
Adding noindex to dynamically created search pages will prevent the problem from ever existing. However, not all CMS tools allow this to be done.
Hi
I am in the process of writing meta titles and meta descriptions for a client's product portfolio. I have the complete list exported from the CMS into Excel.
I know I could add some clever logic to build the meta information, but for this purpose I want to do it manually. Does anyone have an online editor or an Excel sheet that shows you the character limits while writing the titles and descriptions?
This would be handy if someone has one; otherwise it's a case of writing the Excel sheet to do this myself. Not that hard to do, but I thought I'd ask the question.
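In case it helps, here's a minimal Python sketch of that kind of length check. The 60/155 cut-offs are just common rules of thumb (Google actually truncates by pixel width, not character count), so treat them as assumptions you can tweak:

```python
# Rough guideline limits, not hard rules -- adjust to taste.
TITLE_LIMIT = 60
DESC_LIMIT = 155

def check_meta(title, description):
    """Return a list of warnings for over-length meta fields."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (limit ~{TITLE_LIMIT})")
    if len(description) > DESC_LIMIT:
        warnings.append(f"Description is {len(description)} chars (limit ~{DESC_LIMIT})")
    return warnings

print(check_meta("Short title", "Short description"))  # []
```

You could loop this over the rows exported from the CMS and print a warning per product, which gets you most of what a dedicated editor would do.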
I have a website that requires the site structure to be changed. The website doesn't have many backlinks and rankings are fairly low. I have 11,000 products on the website and want to know the best way to change the site structure without causing 404 errors all over the place. Do I 301 redirect every page, or drop all 11,000 pages from the index by adding noindex/nofollow to all pages?
I have the following structure
www.domain.co.uk/make/model/part/product
I want to change this to
www.domain.co.uk/Part/make/model/product
What's the best way to preserve the SEO and link juice at this scale (11,000 pages)?
thank you
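Since the old and new structures use the same four segments in a different order, the 301 map can be generated rather than written by hand. A minimal Python sketch, assuming every product URL has exactly four path segments (make/model/part/product) and using a made-up example URL:

```python
# Build the old -> new path for the structure change described above:
# /make/model/part/product  ->  /part/make/model/product

def new_url(old_path):
    """Reorder the four path segments of a product URL."""
    make, model, part, product = old_path.strip("/").split("/")
    return f"/{part}/{make}/{model}/{product}"

old = "/ford/focus/brake-pads/front-set"  # hypothetical example URL
print(old, "->", new_url(old))
# /ford/focus/brake-pads/front-set -> /brake-pads/ford/focus/front-set
```

Running this over the 11,000 exported URLs gives you an old-to-new mapping file, which you can then turn into whatever your server expects (Apache rewrite rules, an nginx map, or a redirect table in the CMS), with each entry served as a 301.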
shivun