Re: pg_dump and thousands of schemas
From | Hugo
---|---
Subject | Re: pg_dump and thousands of schemas
Date |
Msg-id | 1338091933763-5710183.post@n5.nabble.com
In response to | Re: pg_dump and thousands of schemas (Tom Lane <tgl@sss.pgh.pa.us>)
Responses | Re: pg_dump and thousands of schemas
List | pgsql-performance
Here is a sample dump that takes a long time for pg_dump to write: http://postgresql.1045698.n5.nabble.com/file/n5710183/test.dump.tar.gz (the compressed file is 2.4 MB; the dump itself is 66 MB).

This database has 2,311 schemas similar to those in my production database. All the schemas are empty, but pg_dump still takes 3 hours to finish on my computer. So now you can imagine my production database with more than 20,000 schemas like that. Can you guys take a look and see if the code has room for improvement?

I generated this dump with PostgreSQL 9.1 (which is what I have on my local computer), but my production database uses PostgreSQL 9.0, so it would be great if improvements could be delivered to version 9.0 as well.

Thanks a lot for all the help!

Hugo

--
View this message in context: http://postgresql.1045698.n5.nabble.com/pg-dump-and-thousands-of-schemas-tp5709766p5710183.html
Sent from the PostgreSQL - performance mailing list archive at Nabble.com.
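For readers who want to reproduce the scaling behavior described above without downloading the dump, a minimal sketch is to generate a SQL script that creates a comparable number of empty schemas and then time pg_dump against the resulting database. The schema name pattern (`test_schema_N`) below is illustrative only, not taken from the original dump:

```python
# Hypothetical reproduction sketch: emit a SQL script that creates many
# empty schemas, approximating the database structure described in the post.

def make_schema_script(n_schemas, path=None):
    """Return (and optionally write to `path`) a SQL script containing
    n_schemas CREATE SCHEMA statements, one per line."""
    lines = [f"CREATE SCHEMA test_schema_{i};" for i in range(1, n_schemas + 1)]
    script = "\n".join(lines) + "\n"
    if path is not None:
        with open(path, "w") as f:
            f.write(script)
    return script

if __name__ == "__main__":
    # 2311 matches the schema count reported in the message.
    sql = make_schema_script(2311, path="many_schemas.sql")
    print(sql.count("CREATE SCHEMA"))
```

One could then load the script with `psql -f many_schemas.sql testdb` and measure `time pg_dump testdb > /dev/null` to observe how dump time grows with the schema count.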