Re: VACUUM and ANALYZE With Empty Tables
From: Richard Huxton
Subject: Re: VACUUM and ANALYZE With Empty Tables
Date:
Msg-id: 41A453C7.9090600@archonet.com
In reply to: VACUUM and ANALYZE With Empty Tables ("Mark Dexter" <MDEXTER@dexterchaney.com>)
List: pgsql-general
Mark Dexter wrote:
> We use a development environment that works with Postgres via ODBC and
> uses cursors to insert and update rows in Postgres tables. I'm using
> Postgres version 7.4.5.
>
> A. If I TRUNCATE or DELETE all of the rows in the table and then run
> VACUUM or ANALYZE on the empty table, the test program takes over 15
> minutes to complete (i.e., a 15X performance drop).
>
> If we routinely run VACUUM or VACUUM ANALYZE (e.g., nightly), these work
> tables will normally be empty when the VACUUM is run. So it would
> appear from the testing above that they will experience performance
> problems when inserting large numbers of rows through our application.

Yep - it's a known issue. The analyse is doing what you asked, it's just
not what you want.

> Is there some easy way around this problem? Is there a way to force
> VACUUM or ANALYZE to optimize for a set number of rows even if the table
> is empty when it is run? Thanks for your help. Mark

There are only two options I know of:
1. Vacuum analyse each table separately (tedious, I know; see the sketch
   below)
2. Try pg_autovacuum in the contrib/ directory

The pg_autovacuum utility monitors activity for you and targets tables
once they've seen a certain amount of activity. Even if it hasn't got the
tunability you need, it should be a simple patch to add a list of
"excluded" tables.

--
Richard Huxton
Archonet Ltd
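As a rough illustration of option 1, a nightly maintenance job could run
VACUUM ANALYZE on each long-lived table explicitly and simply skip the
work tables, so their statistics keep describing a loaded table. This is
a minimal sketch, not from the original thread; the table names (orders,
customers, invoices, work_queue) are placeholders.

    -- Nightly maintenance sketch for option 1: analyse long-lived tables
    -- one by one and deliberately skip the empty work tables.
    -- Table names are illustrative placeholders.
    VACUUM ANALYZE orders;
    VACUUM ANALYZE customers;
    VACUUM ANALYZE invoices;
    -- The work table (e.g. work_queue) is intentionally left out:
    -- vacuuming or analysing it while empty records a row count of zero,
    -- and the planner then picks plans suited to a tiny table when the
    -- application later inserts thousands of rows.

Run from cron via psql, a script like this replaces a blanket
database-wide VACUUM ANALYZE while leaving the work tables' statistics
untouched.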