Updating large postgresql database with blobs
| From | CAJ CAJ |
|---|---|
| Subject | Updating large postgresql database with blobs |
| Date | |
| Msg-id | 467669b30703111718g6e045c28r1812ec256b23ebf0@mail.gmail.com |
| Replies | Re: Updating large postgresql database with blobs |
| List | pgsql-hackers |
Hello,

I didn't get any response on the GENERAL list, so I'm escalating this.

We have several independent database servers with ~50 GB+ databases running postgres 8.0.x. We are planning to upgrade these databases to postgres 8.2.x over the weekend.

We plan to use the following steps to upgrade each server.

Dumping the database:

1. Dump the 8.0.x database cluster using the 8.2.x pg_dumpall:

   % ./pg_dumpall > pgdumpall_backup.sql

2. Dump the 8.0.x database, including large objects, in compressed custom format using the 8.2.x pg_dump:

   % ./pg_dump -Fc -b -Z9 dbname > pgdump_lobs_backup

Restoring the database:

1. Initialize the 8.2.x database:

   % initdb -D /data/pgdata

2. Restore the template1 database from the cluster dump:

   % ./psql -d template1 < pgdumpall_backup.sql

3. Drop the database dbname, otherwise the restore will fail with an error about the existing dbname:

   % dropdb dbname

4. Create a fresh dbname:

   % createdb -O dbowner dbname

5. Restore the database with large objects:

   % ./pg_restore -v -Fc -d dbname -e -U dbowner < pgdump_lobs_backup

Some of the problems we have are:

1. We are not sure whether all of the data will be available after a dump/restore with the above process.
2. The dump and restore process is far too slow to complete over the weekend (dumping runs at roughly 1 GB/hr on a dual G5 PPC 2 GHz with 1 GB RAM and RAID 1 disks).

What is the fastest way to upgrade postgres for large databases that have binary objects?

Thanks for all your help.
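The dump and restore steps above can be consolidated into one script. This is a minimal sketch, not the poster's actual procedure: the `PGBIN` path is an assumed install location for the 8.2.x binaries, and the `plan` wrapper (which only records and prints each command) is a hypothetical dry-run helper so the sequence can be reviewed before touching a live cluster; the database name, owner, and dump file names are taken from the steps above.

```shell
#!/bin/sh
# Sketch of the 8.0.x -> 8.2.x dump/restore sequence (dry-run only).
set -u

PGBIN=/usr/local/pgsql-8.2/bin   # assumed location of the 8.2.x client binaries
PGDATA=/data/pgdata
DBNAME=dbname
DBOWNER=dbowner

PLAN=""
plan() {            # record and print a step; swap 'echo' for eval "$*" to execute
    PLAN="$PLAN$*
"
    echo "$*"
}

# Dump phase: run the 8.2.x client tools against the live 8.0.x server.
plan "$PGBIN/pg_dumpall > pgdumpall_backup.sql"
plan "$PGBIN/pg_dump -Fc -b -Z9 $DBNAME > pgdump_lobs_backup"

# Restore phase: fresh 8.2.x cluster, cluster-wide globals first,
# then the custom-format dump with large objects.
plan "$PGBIN/initdb -D $PGDATA"
plan "$PGBIN/psql -d template1 -f pgdumpall_backup.sql"
plan "$PGBIN/dropdb $DBNAME"
plan "$PGBIN/createdb -O $DBOWNER $DBNAME"
plan "$PGBIN/pg_restore -v -Fc -d $DBNAME -e -U $DBOWNER pgdump_lobs_backup"
```

Reviewing the printed plan before execution guards against the ordering mistakes the steps warn about (restoring into an existing dbname, or restoring the custom dump before the globals exist).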