SEO for replicated website system
-
I have a client who has 750 agents. They want to provide them all with a website on a subdomain (mysite.domain.com). The sites will all contain basically the same info; however, this info can be customized on each site by each rep. Most of these reps sell pretty much the same thing, so the customization won't be very dramatic.
So the question is, how can we build this replicated website system and deliver SEO value to each site?
-
Put rel=canonical on each agent profile. Beyond that, you can reduce the duplication by adding more and more unique information about each agent.
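As a minimal sketch of the canonical-tag suggestion above: each agent page's `<head>` would carry a link element pointing at its own preferred URL (the domain and URL scheme here are placeholders, not the poster's actual site):

```python
# Hypothetical sketch: emit the rel=canonical link element for an agent page.
# "domain.com" and the /agents/ path are assumptions for illustration only.
def canonical_tag(agent_slug, base="https://www.domain.com/agents"):
    """Return the <link rel="canonical"> element for an agent's profile page."""
    return f'<link rel="canonical" href="{base}/{agent_slug}/">'

print(canonical_tag("jane-doe"))
# -> <link rel="canonical" href="https://www.domain.com/agents/jane-doe/">
```

The point is simply that every near-duplicate page declares one canonical URL for itself, so the pages don't compete with each other in the index.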
-
I would send each agent a questionnaire. It would be an email or web form with fields whose answers can be extracted. It would contain questions about the agent's education and experience, plus a couple of optional personal items. This will give unique content about the agent.
The questionnaire would also ask for information about the office location, phone number and directions on how to get there. Include a map, a photo of the building, a photo of the agent. This will include enough information to optimize the agents page for local search.
Because the questionnaire uses structured fields, a program can draw on the answers to generate a unique title tag, meta description, and other tags for each page. The goal is to construct an optimized page that will draw local traffic.
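The program that fills in the title and meta description could work roughly like this. This is a sketch under stated assumptions: the field names (`name`, `city`, `state`, `years`) are hypothetical questionnaire fields, not something from the original post:

```python
# Hypothetical sketch: build a unique title tag and meta description from
# an agent's questionnaire answers. All field names are assumptions.
def build_meta(agent):
    """Return title and meta-description text for one agent's page."""
    title = f"{agent['name']} - Agent in {agent['city']}, {agent['state']}"
    description = (
        f"{agent['name']} has {agent['years']} years of experience "
        f"serving {agent['city']}, {agent['state']}. Contact our local office today."
    )
    return {"title": title, "description": description}

tags = build_meta({"name": "Jane Doe", "city": "Austin",
                   "state": "TX", "years": 12})
print(tags["title"])
# -> Jane Doe - Agent in Austin, TX
```

Each agent gets a distinct title and description even though the page template is shared, which is exactly what helps the pages rank for local queries.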
I would not leave it up to the agent to edit this information on the website.
Once a year, each agent will be sent a questionnaire pre-filled with last year's information, which they can edit or approve. New agents will be asked to complete a new form or edit the form of the agent they replaced.
Finally, I would not place these on subdomains. They would be organized in folders instead. Then you can draw information from all of the questionnaires to construct category pages for major geographic areas such as states or major cities.
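The category pages described above could be driven by grouping the questionnaire records by region. A minimal sketch, assuming each record carries hypothetical `name` and `state` fields:

```python
# Hypothetical sketch: group agent records (from the questionnaires) by state
# so each state's category page can list its agents. Field names are assumptions.
from collections import defaultdict

def group_by_state(agents):
    """Map each state to the list of agent names located there."""
    pages = defaultdict(list)
    for agent in agents:
        pages[agent["state"]].append(agent["name"])
    return dict(pages)

agents = [{"name": "Jane Doe", "state": "TX"},
          {"name": "John Roe", "state": "TX"},
          {"name": "Ann Poe", "state": "CA"}]
print(group_by_state(agents))
# -> {'TX': ['Jane Doe', 'John Roe'], 'CA': ['Ann Poe']}
```

The same grouping can be repeated at the city level, giving the folder hierarchy (state pages linking down to city pages linking down to agent pages) that the answer recommends over subdomains.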
Done right, these questionnaires can be used to produce agent pages and many other pages that will compete in local, regional, and other categories of search.