Re: Best practices for migrating a development database to a release database

| From | Vivek Khera |
| --- | --- |
| Subject | Re: Best practices for migrating a development database to a release database |
| Date | |
| Msg-id | x7y8jcu4jl.fsf@yertle.int.kciLink.com |
| In reply to | Best practices for migrating a development database to a release database (Collin Peters <cpeters@mcrt.ca>) |
| List | pgsql-general |
>>>>> "CP" == Collin Peters <cpeters@mcrt.ca> writes: CP> I have thought of the following situations: CP> -Simply track all the changes you made to the development database and CP> make the same changes to the release database CP> -Back up the release database... overwrite it with the development CP> database... then copy all your real data back into the release CP> database (this last step is probably quite difficult) CP> -Perhaps some combination of the two You need one more layer: the staging server. What we do is develop on local workstations, prepare release on a staging server, then push the staging server info to the production box, or run the same updating script on production. Any schema changes are done via scripts within transactions. The renames, alters, grants, etc., are all tested on the staging server with a current copy (pg_dump/restore) from the live server so we know there won't be any surprizes on the live data (or close to it). It also lets us know how long some things might take. For example, this weekend we need to add a primary key to a 65 million row table that just logs events. Until now it really didn't need a PK since it was never updated and the queries were all aggregates. However, to run slony replication it needs a PK... The test procedure of doing it on the staging server pointed out some flaws in the conversion script that were not noticed when running on the development server because the dataset was so small. These flaws would have made the DB unusable for something like 5 days (if it ever completed -- I don't know because I aborted that test) while the update occurred, and once done would leave the application without access to the revised table. Naturally, we found better ways to do it that have trimmed the expected time down to about 1.5 hours or less. You really have to take each situation separately. The easy way of the PK adding script works fine on tables up to about 60k or 100k rows, so we used that on some other smaller tables. -- =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-= Vivek Khera, Ph.D. Khera Communications, Inc. Internet: khera@kciLink.com Rockville, MD +1-301-869-4449 x806 AIM: vivekkhera Y!: vivek_khera http://www.khera.org/~vivek/