how to handle a big table for data log

From: kuopo
Subject: how to handle a big table for data log
Date:
Msg-id: AANLkTilCP3sGTbHIvrM-ixG-P1Dz6ToqrhbgijY8m8V8@mail.gmail.com
Responses: Re: how to handle a big table for data log  ("Jorge Montero" <jorge_montero@homedecorators.com>)
List: pgsql-performance
Hi,

I need to handle a log table that accumulates a large volume of log
records. The table is only ever inserted into and queried. To limit the
table size, I tried splitting it by date, but the number of logs per
table is still large (46 million records per day). To reduce the size
further, I then also split the log table by log type. However, this did
not improve performance; it is actually much slower than the single big
table. My guess is that the extra auto-vacuum/analyze work across all
the split tables is costing more than the split saves.
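For reference, the date-based split described above can be sketched with
PostgreSQL range partitioning; the table and column names here are
hypothetical, and on older PostgreSQL versions the same layout would be
built with table inheritance plus CHECK constraints instead:

```sql
-- Hypothetical parent log table, partitioned by day.
CREATE TABLE data_log (
    logged_at  timestamptz NOT NULL,
    log_type   integer     NOT NULL,
    payload    text
) PARTITION BY RANGE (logged_at);

-- One partition per day; inserts are routed to the matching child.
CREATE TABLE data_log_2010_07_19 PARTITION OF data_log
    FOR VALUES FROM ('2010-07-19') TO ('2010-07-20');
```

Note that splitting further by log type multiplies the number of child
tables (days x types), and each one is vacuumed and analyzed
independently, which is where the extra overhead I mention comes from.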

Can anyone comment on this situation? Thanks in advance.


kuopo.
