Database Driven Websites: Crawling and Indexing Issues
-
Hi all - I'm working on an SEO project, and this is my first database-driven website, built on a custom CMS. Almost all of the pages are created by the admin user in the CMS, pulling info from a database.
What are the best practices here regarding SEO? I know the conventional wisdom is that static pages are good, and that the more static the site is, the better, but how does Google actually treat a site like this?
For instance, let's say the user creates a new page in the CMS and then publishes it. The page is rendered and navigable, combining the user-entered content with info pulled from the database (used to generate the title tag, H1 tags, etc.). Will this page now be crawled successfully and indexed as if it were a static page in Google's eyes, and is it therefore safe to start working on its rankings?
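To make the moving parts concrete, here's a minimal sketch (in Python, with hypothetical field names standing in for whatever the custom CMS actually stores) of what happens at render time: the stored fields are merged into one final HTML document, and that final document is all a crawler ever receives.

```python
# Minimal sketch of database-driven page rendering.
# The "database row" and its field names are hypothetical stand-ins
# for whatever the custom CMS actually stores.

PAGE_TEMPLATE = """<!DOCTYPE html>
<html>
<head><title>{title}</title></head>
<body>
<h1>{h1}</h1>
{body}
</body>
</html>"""

def render_page(db_row: dict) -> str:
    """Merge stored fields into the final HTML -- the only thing Googlebot sees."""
    return PAGE_TEMPLATE.format(
        title=db_row["title"],
        h1=db_row["h1"],
        body=db_row["body"],
    )

# Example: a row as the CMS might store it after the admin saves a page.
row = {
    "title": "Blue Widgets | Example Store",
    "h1": "Blue Widgets",
    "body": "<p>User-entered page content goes here.</p>",
}
html = render_page(row)
print(html)
```

Whether that HTML came from a flat file or a database lookup is invisible in the response itself.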
Any help is appreciated - thanks!
-
Almost everything these days is DB driven! I see no issues, and the same optimization rules apply. Yes, Google just sees the final rendered version of the page, so aside from potentially slower load times, it all looks the same!
Once it's crawlable, go to town!
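On the "crawlable" part: one quick sanity check is confirming the URL isn't blocked by robots.txt. A small sketch using Python's standard-library robotparser (the robots.txt contents and URLs here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the site being checked: block the CMS
# back end, allow everything else.
robots_txt = """User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A normal CMS-generated page is fetchable...
print(parser.can_fetch("Googlebot", "https://example.com/widgets/blue"))  # True

# ...while the CMS back end stays blocked.
print(parser.can_fetch("Googlebot", "https://example.com/admin/edit"))    # False
```

In practice you'd also check for noindex meta tags and make sure every published page is reachable through internal links.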
-
So even though the page isn't technically a physical HTML file sitting permanently on the server, Google will still index it as if it were?