Hi Cyrus,
Here's some more information; hopefully it will help. All of our product pages are generated by a program. Users enter data into specific fields, and that data gets saved.
Once the publish button in the program is pushed, the data goes to our developers. It isn't live on the web yet; at this point it's just data. The data is dumped into a folder, and our developers generate SEO URLs and populate templates with it.
This is what gets published live and indexed.
If I want to change an SEO URL, I simply specify which fields to generate the new URL from and have the existing URL redirect to the new one. This way I can keep it to a single hop, so crawlers never chain from one redirect to another.
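To make the single-hop idea concrete, here is a minimal sketch of what that could look like. The field names (`brand`, `product_name`) and the URL pattern are hypothetical, since I don't know the actual fields the program uses; the point is that when a URL changes, any older redirects are re-pointed at the newest target so there is never more than one hop.

```python
import re

def seo_url(fields):
    """Build an SEO URL slug from selected product fields.
    The field names here ("brand", "product_name") are made up
    for illustration."""
    slug = "-".join(fields[k] for k in ("brand", "product_name"))
    slug = re.sub(r"[^a-z0-9]+", "-", slug.lower()).strip("-")
    return f"/products/{slug}"

# redirect_map holds old-URL -> current-URL pairs.
redirect_map = {}

def change_seo_url(old_url, new_url):
    """Point the old URL, plus anything that already redirected
    to it, straight at the new URL, keeping every redirect to a
    single hop."""
    for src, dst in redirect_map.items():
        if dst == old_url:
            redirect_map[src] = new_url
    redirect_map[old_url] = new_url
```

With this shape, redirecting A to B and then B to C leaves A pointing directly at C, so a crawler never walks a chain.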
Google just sees this SEO URL and template page.
The problem is that the website's internal search doesn't use the SEO URL; instead it generates a URL based on the folder the data gets dumped into. When a user clicks this URL, they're redirected to the SEO page. The developers did this for "efficiency reasons."
This is where the meta refresh kicks in. Anyone who clicks on a product from a search gets redirected to the SEO URL via a meta refresh. SEOMoz is reporting a high number of meta refresh issues, but my developers say it's not a problem because the search pages never get indexed.
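For anyone following along, here is a rough sketch of the difference between what those search URLs serve today and what a server-side redirect would serve instead. The URLs are hypothetical; the key point is that a meta refresh is an instruction inside a normal 200 OK HTML page, while a 301 is an HTTP-level redirect with no page for the crawler to parse.

```python
def meta_refresh_page(seo_url):
    """What the internal-search URL serves today: a 200 OK page
    whose only job is to bounce the visitor to the SEO URL."""
    return (
        "<html><head>"
        f'<meta http-equiv="refresh" content="0; url={seo_url}">'
        "</head><body></body></html>"
    )

def http_redirect(seo_url):
    """What a server-side 301 would send instead: a status code
    plus a Location header, with no HTML body needed."""
    return 301, {"Location": seo_url}
```

The crawler has to fetch and parse the whole meta refresh page to discover the target, whereas the 301 tells it immediately where the canonical URL lives.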
In my opinion, even though the page isn't getting indexed, the spider is still following the link and noticing the meta refresh.
After pressing further, they said they can turn off the caching of redirects. I'm not well versed in the implications of turning off redirect caching, hence my question.
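My rough understanding of the trade-off, sketched below with made-up header values: a redirect that carries caching headers can be reused by browsers and proxies without re-asking the server, which is cheaper but means a later change to the target may not take effect until the cached copy expires; with caching off, every request re-checks the redirect, so changes apply immediately at the cost of extra requests.

```python
def redirect_response(location, cacheable):
    """Sketch of a 301 response with and without caching headers.
    The max-age value is an arbitrary example, not what any
    particular server sends."""
    headers = {"Location": location}
    if cacheable:
        # Clients may reuse this redirect for a day without re-fetching.
        headers["Cache-Control"] = "max-age=86400"
    else:
        # Clients must re-request the redirect every time.
        headers["Cache-Control"] = "no-store"
    return 301, headers
```

If that reading is right, turning caching off mainly costs some server load and a little latency per click, in exchange for redirects that can be changed and take effect right away.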