Re: Insert 1 million data
From | Olivier Gautherot |
---|---|
Subject | Re: Insert 1 million data |
Date | |
Msg-id | CAJ7S9TX0smQF6_hNf3EbcPGGeKYQpHwdUmyXsnFo7hhASiq3CA@mail.gmail.com |
In reply to | Re: Insert 1 million data (Sreejith P <sreejith@lifetrenz.com>) |
List | pgsql-admin |
Hi Sreejit,
On Tue, Dec 29, 2020 at 10:56 AM Sreejith P <sreejith@lifetrenz.com> wrote:
Thanks Rohit.
After upgrading the volume, we are getting the following error, almost the same as the previous one.
We increased the backup volume and ran the job again. When it reaches 900 thousand records, we get almost the same error again.
- Do I need to turn off autovacuum?
- Shall I increase maintenance_work_mem?
If you're tight on space, my recommendation would be to run the inserts in small batches (say 10,000 at a time). Don't turn off autovacuum, ever :-)
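A minimal sketch of the batching idea, assuming a Python client and psycopg2; the table, columns, and connection string are placeholders, not from the original thread. The chunking helper itself is plain Python and the database part is shown only as a commented-out usage pattern:

```python
from itertools import islice

def batches(rows, size=10_000):
    """Yield successive chunks of at most `size` rows from any iterable."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical usage with psycopg2 (names and DSN are illustrative only):
# import psycopg2
# conn = psycopg2.connect("dbname=mydb")
# with conn, conn.cursor() as cur:
#     for chunk in batches(generate_rows(), 10_000):
#         cur.executemany("INSERT INTO t (a, b) VALUES (%s, %s)", chunk)
#         conn.commit()  # committing per batch keeps transaction size bounded
```

Committing after each batch keeps any single transaction small, so the server never has to hold a million rows' worth of uncommitted work at once.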
That being said, if you're struggling this much just creating the database, my inclination would be to move it, along with its logs, to a disk with more space. As it stands, your server has no headroom to grow, and you'll run into more dramatic crashes very quickly.
My two cents' worth...
--
Olivier Gautherot
In the pgsql-admin list, by sent date: