Re: Need an idea to operate massive delete operation on big size table.
From: Laurenz Albe
Subject: Re: Need an idea to operate massive delete operation on big size table.
Date:
Msg-id: 82cbc03d6bae358db66430f5518abd6b1a683571.camel@cybertec.at
In reply to: Need an idea to operate massive delete operation on big size table. (Gambhir Singh <gambhir.singh05@gmail.com>)
Responses: Re: Need an idea to operate massive delete operation on big size table.
List: pgsql-admin
On Wed, 2025-01-15 at 20:23 +0530, Gambhir Singh wrote:
> I received a request from a client to delete duplicate records from a table which is very large in size.
>
> Delete queries (~2 billion) are provided via file, and we have to execute that file in DB.
> Last time it lasted for two days. I feel there must be another way to delete records in an efficient manner.
>
> This kind of activity they do every month.

I don't think there is a better way, except perhaps to create a new copy of the table and copy the surviving rows to the new table. That may win if you delete a majority of the rows.

For the future, you could consider not adding the duplicate rows rather than deleting them. Perhaps a constraint that prevents the duplicates can help.

Yours,
Laurenz Albe
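A minimal sketch of the copy-and-swap approach described above, assuming a hypothetical table big_table whose duplicate rows agree on (col1, col2); all table and column names here are placeholders:

    -- Keep one row per (col1, col2) group; DISTINCT ON is PostgreSQL-specific.
    CREATE TABLE big_table_new AS
    SELECT DISTINCT ON (col1, col2) *
    FROM big_table
    ORDER BY col1, col2;

    -- Recreate indexes and constraints on the new table here, then swap
    -- the names in one transaction so readers never see a missing table.
    BEGIN;
    ALTER TABLE big_table RENAME TO big_table_old;
    ALTER TABLE big_table_new RENAME TO big_table;
    COMMIT;

    DROP TABLE big_table_old;

Loading into an unindexed table and building the indexes afterwards is typically much faster than deleting a large fraction of rows in place.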
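And a sketch of the preventive approach for the monthly loads, with the same placeholder names; note that the unique constraint can only be added once the existing duplicates are gone:

    -- Enforce uniqueness so duplicates can never be inserted again.
    ALTER TABLE big_table
        ADD CONSTRAINT big_table_col1_col2_uniq UNIQUE (col1, col2);

    -- The monthly load can then silently skip rows that already exist.
    INSERT INTO big_table (col1, col2, payload)
    VALUES ('a', 1, 'some data')
    ON CONFLICT ON CONSTRAINT big_table_col1_col2_uniq DO NOTHING;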