Until now this has not been a problem: to change the database I would alter the SQL that creates it, drop the database, recreate it, load up the test data and carry on. All well and good, however...
The project (a web-based shop) is about to go live, and my problem is that on the live system the data will need to be kept across these changes.
My question is how to go about making changes to the database structure in a way that can be applied to a production system. I have come up with the following possible methods, but would like some feedback on them before committing to one.
- Transfer script: The old database is kept and a new one is created. Data is then transferred from the old one to the new one by a perl script which also accounts for any structural changes.
- Make changes using SQL commands: All changes to the database are made with 'ALTER TABLE ... ADD COLUMN' commands so that the existing data stays in place. Essentially a series of patches are applied.
- Store data and reload it: All the data is dumped to file (possibly as XML or YAML), the database is recreated from scratch, and the data is then reloaded into the new database.
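For concreteness, the transfer-script idea can be sketched like this (I'm using Python and SQLite here purely as a portable illustration; the tables, columns, and the name-splitting rule are all made up):

```python
import sqlite3

# Hypothetical old schema: a single "name" column.
# Hypothetical new schema: "first_name" / "last_name".
old = sqlite3.connect(":memory:")
old.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
old.execute("INSERT INTO customers VALUES (1, 'John Smith')")

new = sqlite3.connect(":memory:")
new.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, first_name TEXT, last_name TEXT)"
)

# The transfer loop: read rows from the old structure, transform them to
# fit the new structure, and insert them into the new database.
for cust_id, name in old.execute("SELECT id, name FROM customers"):
    first, _, last = name.partition(" ")
    new.execute("INSERT INTO customers VALUES (?, ?, ?)", (cust_id, first, last))
new.commit()
```

In a real perl script the two handles would point at the old and new Postgres databases, and the transform step would encode whatever structural changes were made.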
There are benefits and problems with each. All of them require keeping track of which version of the schema is currently in use and acting accordingly.
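Whichever method is chosen, the version bookkeeping itself can be as small as a one-row table. A minimal sketch (Python/SQLite for illustration; the `schema_version` table name is my own invention):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One row recording which schema version this database currently has.
db.execute("CREATE TABLE schema_version (version TEXT NOT NULL)")
db.execute("INSERT INTO schema_version VALUES ('2.3')")

def current_version(conn):
    # Patch scripts consult this to decide whether they apply.
    return conn.execute("SELECT version FROM schema_version").fetchone()[0]
```

Each patch would then finish by updating that row to the version it produced.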
As I see it this is a general problem for any database-backed project, but I have not seen much discussion of it here or elsewhere (please correct me if I'm wrong). For the record, I am using Postgres 7.4.
I am personally leaning towards the SQL-commands approach, with the patches stored in files named something like 'alter-2.3-2.4.sql', each of which would upgrade the database from version 2.3 to 2.4.
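As a sketch of how such patch files could be driven (Python/SQLite stand-ins again; the patch contents and the chain-by-filename rule are assumptions for illustration, not code from my project):

```python
import sqlite3

# Stand-ins for the contents of alter-2.3-2.4.sql and so on, which would be
# read from disk in a real setup. Each patch records the version it produces.
patches = {
    "alter-2.3-2.4.sql":
        "ALTER TABLE customers ADD COLUMN email TEXT;"
        "UPDATE schema_version SET version = '2.4';",
    "alter-2.4-2.5.sql":
        "ALTER TABLE customers ADD COLUMN phone TEXT;"
        "UPDATE schema_version SET version = '2.5';",
}

def upgrade(conn, target):
    # Apply patches in sequence until the recorded version reaches the target.
    while True:
        version = conn.execute("SELECT version FROM schema_version").fetchone()[0]
        if version == target:
            return
        # Pick the patch whose filename starts from the current version.
        name = next(p for p in sorted(patches) if p.startswith("alter-%s-" % version))
        conn.executescript(patches[name])

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE schema_version (version TEXT NOT NULL)")
db.execute("INSERT INTO schema_version VALUES ('2.3')")
db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
upgrade(db, "2.5")
```

The nice property is that a database at any recorded version can be walked forward to the current one by applying the intermediate patches in order.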
Another consideration: How can all this be tested?
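One answer: rehearse every patch on a throwaway copy of the production data before it goes anywhere near the live system. A rough sketch (Python/SQLite once more; `snapshot_copy` is a name I made up):

```python
import sqlite3

def snapshot_copy(source):
    # Clone the database into memory so a patch can be rehearsed risk-free.
    clone = sqlite3.connect(":memory:")
    source.backup(clone)
    return clone

# Stand-in for the production database (in practice, restored from a dump).
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
prod.execute("INSERT INTO customers VALUES (1, 'John Smith')")

rehearsal = snapshot_copy(prod)
# Apply the structural change to the copy only...
rehearsal.execute("ALTER TABLE customers ADD COLUMN email TEXT")

# ...then check that no rows were lost and the old data survived.
assert rehearsal.execute("SELECT COUNT(*) FROM customers").fetchone() == \
       prod.execute("SELECT COUNT(*) FROM customers").fetchone()
```

With Postgres the equivalent would be a pg_dump of production restored into a scratch database, with the patch run and checked there first.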
--tidiness is the memory loss of environmental mnemonics