managing database with thousands of tables
From | Eugeny N Dzhurinsky
---|---
Subject | managing database with thousands of tables
Date | 
Msg-id | 20060705130703.GA2428@office.redwerk.com
Responses | Re: managing database with thousands of tables
List | pgsql-performance
Hello! I am facing some strange performance problems with PostgreSQL 8.0. I have an application which handles a lot of tasks, and each task is kept in a separate table. Those tables are dropped and created again periodically (precisely, whenever new task results come back from a remote server). Each table can hold hundreds of thousands of records, though most contain just a few thousand.

Sometimes I see a performance loss while working with the database, and after I vacuumed the entire database, I saw that some tables and indexes in the pg_* schemas were optimized and hundreds of thousands of records were deleted. Could that be the reason for the performance loss, and if so, how can I fix it?

I have pg_autovacuum up and running all the time:

    pg_autovacuum -d 3 -D -L /dev/null

but it seems pg_autovacuum does not vacuum the system tables.

-- Eugene Dzhurinsky
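[A minimal workaround sketch, assuming the bloat really does sit in the system catalogs: vacuum by hand the catalogs that churn whenever tables are dropped and re-created. The particular catalog list below is a guess at the usual suspects for DDL churn, not something stated in the post, and the commands must be run by a user with sufficient privileges:

    -- catalogs that accumulate dead rows under heavy CREATE/DROP TABLE traffic
    VACUUM ANALYZE pg_catalog.pg_class;
    VACUUM ANALYZE pg_catalog.pg_attribute;
    VACUUM ANALYZE pg_catalog.pg_index;
    VACUUM ANALYZE pg_catalog.pg_type;
    VACUUM ANALYZE pg_catalog.pg_depend;

Alternatively, a scheduled database-wide pass (for example, vacuumdb --analyze --dbname yourdb from cron, where yourdb is a placeholder) would cover the system catalogs as well, at the cost of touching every table.]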