Re: Strategies/Best Practises Handling Large Tables
| From | Chris Travers |
|---|---|
| Subject | Re: Strategies/Best Practises Handling Large Tables |
| Date | |
| Msg-id | CAKt_Zfs64hDKsJTTj0YkYLLE7W38KKbp6pDVJ8edj=U-LcBeTA@mail.gmail.com |
| In reply to | Strategies/Best Practises Handling Large Tables (Chitra Creta <chitracreta@gmail.com>) |
| Responses | Re: Strategies/Best Practises Handling Large Tables |
| List | pgsql-general |
Hi,
I currently have a table that is growing very quickly, i.e. 7 million records in 5 days. This table acts as a placeholder for statistics, so records are only ever inserted, never updated or deleted.
Many queries are run against this table for trend analysis. However, these queries are now starting to take a very long time (hours) to execute due to the size of the table.
I have put indexes on this table, to no significant benefit. Some of the other strategies I have thought of:
1. Purge old data
2. Reindex
3. Partition
4. Create daily, monthly, and yearly summary tables that contain aggregated data specific to the statistics required

Does anyone know what the best practice is to handle this situation?
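Strategies 3 and 4 can be combined. As a rough sketch (table and column names here are hypothetical, since the original schema is not shown; declarative partitioning requires PostgreSQL 10 or later, while older versions would use inheritance-based partitioning), a statistics table partitioned by month makes purging old data (strategy 1) a cheap partition drop, and a periodic summary table keeps trend queries off the raw data:

```sql
-- Hypothetical schema: an insert-only statistics table, range-partitioned by time.
CREATE TABLE stats (
    recorded_at timestamptz NOT NULL,
    metric      text        NOT NULL,
    value       bigint      NOT NULL
) PARTITION BY RANGE (recorded_at);

-- One partition per month; create new ones ahead of time (e.g. from cron).
CREATE TABLE stats_2012_10 PARTITION OF stats
    FOR VALUES FROM ('2012-10-01') TO ('2012-11-01');

-- Purging old data is then a metadata-only operation:
-- DROP TABLE stats_2012_09;

-- Strategy 4: a daily summary table, rebuilt or appended to on a schedule,
-- so trend queries scan aggregated rows instead of the raw table.
CREATE TABLE stats_daily AS
SELECT date_trunc('day', recorded_at) AS day,
       metric,
       count(*)   AS n_samples,
       sum(value) AS total
FROM stats
GROUP BY 1, 2;
```

Trend queries would then target `stats_daily` (or a monthly/yearly rollup built the same way), and only drill into the partitioned raw table when per-record detail is needed.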